Jan 21 17:44:03 np0005591285 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 21 17:44:03 np0005591285 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 21 17:44:03 np0005591285 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 17:44:03 np0005591285 kernel: BIOS-provided physical RAM map:
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 21 17:44:03 np0005591285 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 21 17:44:03 np0005591285 kernel: NX (Execute Disable) protection: active
Jan 21 17:44:03 np0005591285 kernel: APIC: Static calls initialized
Jan 21 17:44:03 np0005591285 kernel: SMBIOS 2.8 present.
Jan 21 17:44:03 np0005591285 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 21 17:44:03 np0005591285 kernel: Hypervisor detected: KVM
Jan 21 17:44:03 np0005591285 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 21 17:44:03 np0005591285 kernel: kvm-clock: using sched offset of 3463645806 cycles
Jan 21 17:44:03 np0005591285 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 21 17:44:03 np0005591285 kernel: tsc: Detected 2799.998 MHz processor
Jan 21 17:44:03 np0005591285 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 21 17:44:03 np0005591285 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 21 17:44:03 np0005591285 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 21 17:44:03 np0005591285 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 21 17:44:03 np0005591285 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 21 17:44:03 np0005591285 kernel: Using GB pages for direct mapping
Jan 21 17:44:03 np0005591285 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 21 17:44:03 np0005591285 kernel: ACPI: Early table checksum verification disabled
Jan 21 17:44:03 np0005591285 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 21 17:44:03 np0005591285 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:44:03 np0005591285 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:44:03 np0005591285 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:44:03 np0005591285 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 21 17:44:03 np0005591285 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:44:03 np0005591285 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 17:44:03 np0005591285 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 21 17:44:03 np0005591285 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 21 17:44:03 np0005591285 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 21 17:44:03 np0005591285 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 21 17:44:03 np0005591285 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 21 17:44:03 np0005591285 kernel: No NUMA configuration found
Jan 21 17:44:03 np0005591285 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 21 17:44:03 np0005591285 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 21 17:44:03 np0005591285 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 21 17:44:03 np0005591285 kernel: Zone ranges:
Jan 21 17:44:03 np0005591285 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 21 17:44:03 np0005591285 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 21 17:44:03 np0005591285 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 17:44:03 np0005591285 kernel:  Device   empty
Jan 21 17:44:03 np0005591285 kernel: Movable zone start for each node
Jan 21 17:44:03 np0005591285 kernel: Early memory node ranges
Jan 21 17:44:03 np0005591285 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 21 17:44:03 np0005591285 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 21 17:44:03 np0005591285 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 17:44:03 np0005591285 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 21 17:44:03 np0005591285 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 21 17:44:03 np0005591285 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 21 17:44:03 np0005591285 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 21 17:44:03 np0005591285 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 21 17:44:03 np0005591285 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 21 17:44:03 np0005591285 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 21 17:44:03 np0005591285 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 21 17:44:03 np0005591285 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 21 17:44:03 np0005591285 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 21 17:44:03 np0005591285 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 21 17:44:03 np0005591285 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 21 17:44:03 np0005591285 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 21 17:44:03 np0005591285 kernel: TSC deadline timer available
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Max. logical packages:   8
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Max. logical dies:       8
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Max. dies per package:   1
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Max. threads per core:   1
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Num. cores per package:     1
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Num. threads per package:   1
Jan 21 17:44:03 np0005591285 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 21 17:44:03 np0005591285 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 21 17:44:03 np0005591285 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 21 17:44:03 np0005591285 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 21 17:44:03 np0005591285 kernel: Booting paravirtualized kernel on KVM
Jan 21 17:44:03 np0005591285 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 21 17:44:03 np0005591285 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 21 17:44:03 np0005591285 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 21 17:44:03 np0005591285 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 21 17:44:03 np0005591285 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 17:44:03 np0005591285 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 21 17:44:03 np0005591285 kernel: random: crng init done
Jan 21 17:44:03 np0005591285 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: Fallback order for Node 0: 0 
Jan 21 17:44:03 np0005591285 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 21 17:44:03 np0005591285 kernel: Policy zone: Normal
Jan 21 17:44:03 np0005591285 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 21 17:44:03 np0005591285 kernel: software IO TLB: area num 8.
Jan 21 17:44:03 np0005591285 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 21 17:44:03 np0005591285 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 21 17:44:03 np0005591285 kernel: ftrace: allocated 194 pages with 3 groups
Jan 21 17:44:03 np0005591285 kernel: Dynamic Preempt: voluntary
Jan 21 17:44:03 np0005591285 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 21 17:44:03 np0005591285 kernel: rcu: 	RCU event tracing is enabled.
Jan 21 17:44:03 np0005591285 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 21 17:44:03 np0005591285 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 21 17:44:03 np0005591285 kernel: 	Rude variant of Tasks RCU enabled.
Jan 21 17:44:03 np0005591285 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 21 17:44:03 np0005591285 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 21 17:44:03 np0005591285 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 21 17:44:03 np0005591285 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 17:44:03 np0005591285 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 17:44:03 np0005591285 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 17:44:03 np0005591285 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 21 17:44:03 np0005591285 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 21 17:44:03 np0005591285 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 21 17:44:03 np0005591285 kernel: Console: colour VGA+ 80x25
Jan 21 17:44:03 np0005591285 kernel: printk: console [ttyS0] enabled
Jan 21 17:44:03 np0005591285 kernel: ACPI: Core revision 20230331
Jan 21 17:44:03 np0005591285 kernel: APIC: Switch to symmetric I/O mode setup
Jan 21 17:44:03 np0005591285 kernel: x2apic enabled
Jan 21 17:44:03 np0005591285 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 21 17:44:03 np0005591285 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 21 17:44:03 np0005591285 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 21 17:44:03 np0005591285 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 21 17:44:03 np0005591285 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 21 17:44:03 np0005591285 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 21 17:44:03 np0005591285 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 21 17:44:03 np0005591285 kernel: Spectre V2 : Mitigation: Retpolines
Jan 21 17:44:03 np0005591285 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 21 17:44:03 np0005591285 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 21 17:44:03 np0005591285 kernel: RETBleed: Mitigation: untrained return thunk
Jan 21 17:44:03 np0005591285 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 21 17:44:03 np0005591285 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 21 17:44:03 np0005591285 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 21 17:44:03 np0005591285 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 21 17:44:03 np0005591285 kernel: x86/bugs: return thunk changed
Jan 21 17:44:03 np0005591285 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 21 17:44:03 np0005591285 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 21 17:44:03 np0005591285 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 21 17:44:03 np0005591285 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 21 17:44:03 np0005591285 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 21 17:44:03 np0005591285 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 21 17:44:03 np0005591285 kernel: Freeing SMP alternatives memory: 40K
Jan 21 17:44:03 np0005591285 kernel: pid_max: default: 32768 minimum: 301
Jan 21 17:44:03 np0005591285 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 21 17:44:03 np0005591285 kernel: landlock: Up and running.
Jan 21 17:44:03 np0005591285 kernel: Yama: becoming mindful.
Jan 21 17:44:03 np0005591285 kernel: SELinux:  Initializing.
Jan 21 17:44:03 np0005591285 kernel: LSM support for eBPF active
Jan 21 17:44:03 np0005591285 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 21 17:44:03 np0005591285 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 21 17:44:03 np0005591285 kernel: ... version:                0
Jan 21 17:44:03 np0005591285 kernel: ... bit width:              48
Jan 21 17:44:03 np0005591285 kernel: ... generic registers:      6
Jan 21 17:44:03 np0005591285 kernel: ... value mask:             0000ffffffffffff
Jan 21 17:44:03 np0005591285 kernel: ... max period:             00007fffffffffff
Jan 21 17:44:03 np0005591285 kernel: ... fixed-purpose events:   0
Jan 21 17:44:03 np0005591285 kernel: ... event mask:             000000000000003f
Jan 21 17:44:03 np0005591285 kernel: signal: max sigframe size: 1776
Jan 21 17:44:03 np0005591285 kernel: rcu: Hierarchical SRCU implementation.
Jan 21 17:44:03 np0005591285 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 21 17:44:03 np0005591285 kernel: smp: Bringing up secondary CPUs ...
Jan 21 17:44:03 np0005591285 kernel: smpboot: x86: Booting SMP configuration:
Jan 21 17:44:03 np0005591285 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 21 17:44:03 np0005591285 kernel: smp: Brought up 1 node, 8 CPUs
Jan 21 17:44:03 np0005591285 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 21 17:44:03 np0005591285 kernel: node 0 deferred pages initialised in 9ms
Jan 21 17:44:03 np0005591285 kernel: Memory: 7763864K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618368K reserved, 0K cma-reserved)
Jan 21 17:44:03 np0005591285 kernel: devtmpfs: initialized
Jan 21 17:44:03 np0005591285 kernel: x86/mm: Memory block size: 128MB
Jan 21 17:44:03 np0005591285 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 21 17:44:03 np0005591285 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 21 17:44:03 np0005591285 kernel: pinctrl core: initialized pinctrl subsystem
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 21 17:44:03 np0005591285 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 21 17:44:03 np0005591285 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 21 17:44:03 np0005591285 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 21 17:44:03 np0005591285 kernel: audit: initializing netlink subsys (disabled)
Jan 21 17:44:03 np0005591285 kernel: audit: type=2000 audit(1769035441.638:1): state=initialized audit_enabled=0 res=1
Jan 21 17:44:03 np0005591285 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 21 17:44:03 np0005591285 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 21 17:44:03 np0005591285 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 21 17:44:03 np0005591285 kernel: cpuidle: using governor menu
Jan 21 17:44:03 np0005591285 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 21 17:44:03 np0005591285 kernel: PCI: Using configuration type 1 for base access
Jan 21 17:44:03 np0005591285 kernel: PCI: Using configuration type 1 for extended access
Jan 21 17:44:03 np0005591285 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 21 17:44:03 np0005591285 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 21 17:44:03 np0005591285 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 21 17:44:03 np0005591285 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 21 17:44:03 np0005591285 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 21 17:44:03 np0005591285 kernel: Demotion targets for Node 0: null
Jan 21 17:44:03 np0005591285 kernel: cryptd: max_cpu_qlen set to 1000
Jan 21 17:44:03 np0005591285 kernel: ACPI: Added _OSI(Module Device)
Jan 21 17:44:03 np0005591285 kernel: ACPI: Added _OSI(Processor Device)
Jan 21 17:44:03 np0005591285 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 21 17:44:03 np0005591285 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 21 17:44:03 np0005591285 kernel: ACPI: Interpreter enabled
Jan 21 17:44:03 np0005591285 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 21 17:44:03 np0005591285 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 21 17:44:03 np0005591285 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 21 17:44:03 np0005591285 kernel: PCI: Using E820 reservations for host bridge windows
Jan 21 17:44:03 np0005591285 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 21 17:44:03 np0005591285 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 21 17:44:03 np0005591285 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [3] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [4] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [5] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [6] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [7] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [8] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [9] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [10] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [11] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [12] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [13] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [14] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [15] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [16] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [17] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [18] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [19] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [20] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [21] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [22] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [23] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [24] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [25] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [26] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [27] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [28] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [29] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [30] registered
Jan 21 17:44:03 np0005591285 kernel: acpiphp: Slot [31] registered
Jan 21 17:44:03 np0005591285 kernel: PCI host bridge to bus 0000:00
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 21 17:44:03 np0005591285 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 21 17:44:03 np0005591285 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 21 17:44:03 np0005591285 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 21 17:44:03 np0005591285 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 21 17:44:03 np0005591285 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 21 17:44:03 np0005591285 kernel: iommu: Default domain type: Translated
Jan 21 17:44:03 np0005591285 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 21 17:44:03 np0005591285 kernel: SCSI subsystem initialized
Jan 21 17:44:03 np0005591285 kernel: ACPI: bus type USB registered
Jan 21 17:44:03 np0005591285 kernel: usbcore: registered new interface driver usbfs
Jan 21 17:44:03 np0005591285 kernel: usbcore: registered new interface driver hub
Jan 21 17:44:03 np0005591285 kernel: usbcore: registered new device driver usb
Jan 21 17:44:03 np0005591285 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 21 17:44:03 np0005591285 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 21 17:44:03 np0005591285 kernel: PTP clock support registered
Jan 21 17:44:03 np0005591285 kernel: EDAC MC: Ver: 3.0.0
Jan 21 17:44:03 np0005591285 kernel: NetLabel: Initializing
Jan 21 17:44:03 np0005591285 kernel: NetLabel:  domain hash size = 128
Jan 21 17:44:03 np0005591285 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 21 17:44:03 np0005591285 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 21 17:44:03 np0005591285 kernel: PCI: Using ACPI for IRQ routing
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 21 17:44:03 np0005591285 kernel: vgaarb: loaded
Jan 21 17:44:03 np0005591285 kernel: clocksource: Switched to clocksource kvm-clock
Jan 21 17:44:03 np0005591285 kernel: VFS: Disk quotas dquot_6.6.0
Jan 21 17:44:03 np0005591285 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 21 17:44:03 np0005591285 kernel: pnp: PnP ACPI init
Jan 21 17:44:03 np0005591285 kernel: pnp: PnP ACPI: found 5 devices
Jan 21 17:44:03 np0005591285 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_INET protocol family
Jan 21 17:44:03 np0005591285 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 21 17:44:03 np0005591285 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_XDP protocol family
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 21 17:44:03 np0005591285 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 21 17:44:03 np0005591285 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 21 17:44:03 np0005591285 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 97534 usecs
Jan 21 17:44:03 np0005591285 kernel: PCI: CLS 0 bytes, default 64
Jan 21 17:44:03 np0005591285 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 21 17:44:03 np0005591285 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 21 17:44:03 np0005591285 kernel: ACPI: bus type thunderbolt registered
Jan 21 17:44:03 np0005591285 kernel: Trying to unpack rootfs image as initramfs...
Jan 21 17:44:03 np0005591285 kernel: Initialise system trusted keyrings
Jan 21 17:44:03 np0005591285 kernel: Key type blacklist registered
Jan 21 17:44:03 np0005591285 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 21 17:44:03 np0005591285 kernel: zbud: loaded
Jan 21 17:44:03 np0005591285 kernel: integrity: Platform Keyring initialized
Jan 21 17:44:03 np0005591285 kernel: integrity: Machine keyring initialized
Jan 21 17:44:03 np0005591285 kernel: Freeing initrd memory: 87956K
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_ALG protocol family
Jan 21 17:44:03 np0005591285 kernel: xor: automatically using best checksumming function   avx       
Jan 21 17:44:03 np0005591285 kernel: Key type asymmetric registered
Jan 21 17:44:03 np0005591285 kernel: Asymmetric key parser 'x509' registered
Jan 21 17:44:03 np0005591285 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 21 17:44:03 np0005591285 kernel: io scheduler mq-deadline registered
Jan 21 17:44:03 np0005591285 kernel: io scheduler kyber registered
Jan 21 17:44:03 np0005591285 kernel: io scheduler bfq registered
Jan 21 17:44:03 np0005591285 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 21 17:44:03 np0005591285 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 21 17:44:03 np0005591285 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 21 17:44:03 np0005591285 kernel: ACPI: button: Power Button [PWRF]
Jan 21 17:44:03 np0005591285 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 21 17:44:03 np0005591285 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 21 17:44:03 np0005591285 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 21 17:44:03 np0005591285 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 21 17:44:03 np0005591285 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 21 17:44:03 np0005591285 kernel: Non-volatile memory driver v1.3
Jan 21 17:44:03 np0005591285 kernel: rdac: device handler registered
Jan 21 17:44:03 np0005591285 kernel: hp_sw: device handler registered
Jan 21 17:44:03 np0005591285 kernel: emc: device handler registered
Jan 21 17:44:03 np0005591285 kernel: alua: device handler registered
Jan 21 17:44:03 np0005591285 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 21 17:44:03 np0005591285 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 21 17:44:03 np0005591285 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 21 17:44:03 np0005591285 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 21 17:44:03 np0005591285 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 21 17:44:03 np0005591285 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 21 17:44:03 np0005591285 kernel: usb usb1: Product: UHCI Host Controller
Jan 21 17:44:03 np0005591285 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 21 17:44:03 np0005591285 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 21 17:44:03 np0005591285 kernel: hub 1-0:1.0: USB hub found
Jan 21 17:44:03 np0005591285 kernel: hub 1-0:1.0: 2 ports detected
Jan 21 17:44:03 np0005591285 kernel: usbcore: registered new interface driver usbserial_generic
Jan 21 17:44:03 np0005591285 kernel: usbserial: USB Serial support registered for generic
Jan 21 17:44:03 np0005591285 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 21 17:44:03 np0005591285 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 21 17:44:03 np0005591285 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 21 17:44:03 np0005591285 kernel: mousedev: PS/2 mouse device common for all mice
Jan 21 17:44:03 np0005591285 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 21 17:44:03 np0005591285 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 21 17:44:03 np0005591285 kernel: rtc_cmos 00:04: registered as rtc0
Jan 21 17:44:03 np0005591285 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 21 17:44:03 np0005591285 kernel: rtc_cmos 00:04: setting system clock to 2026-01-21T22:44:02 UTC (1769035442)
Jan 21 17:44:03 np0005591285 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 21 17:44:03 np0005591285 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 21 17:44:03 np0005591285 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 21 17:44:03 np0005591285 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 21 17:44:03 np0005591285 kernel: usbcore: registered new interface driver usbhid
Jan 21 17:44:03 np0005591285 kernel: usbhid: USB HID core driver
Jan 21 17:44:03 np0005591285 kernel: drop_monitor: Initializing network drop monitor service
Jan 21 17:44:03 np0005591285 kernel: Initializing XFRM netlink socket
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_INET6 protocol family
Jan 21 17:44:03 np0005591285 kernel: Segment Routing with IPv6
Jan 21 17:44:03 np0005591285 kernel: NET: Registered PF_PACKET protocol family
Jan 21 17:44:03 np0005591285 kernel: mpls_gso: MPLS GSO support
Jan 21 17:44:03 np0005591285 kernel: IPI shorthand broadcast: enabled
Jan 21 17:44:03 np0005591285 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 21 17:44:03 np0005591285 kernel: AES CTR mode by8 optimization enabled
Jan 21 17:44:03 np0005591285 kernel: sched_clock: Marking stable (1261001424, 148406983)->(1540786480, -131378073)
Jan 21 17:44:03 np0005591285 kernel: registered taskstats version 1
Jan 21 17:44:03 np0005591285 kernel: Loading compiled-in X.509 certificates
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 21 17:44:03 np0005591285 kernel: Demotion targets for Node 0: null
Jan 21 17:44:03 np0005591285 kernel: page_owner is disabled
Jan 21 17:44:03 np0005591285 kernel: Key type .fscrypt registered
Jan 21 17:44:03 np0005591285 kernel: Key type fscrypt-provisioning registered
Jan 21 17:44:03 np0005591285 kernel: Key type big_key registered
Jan 21 17:44:03 np0005591285 kernel: Key type encrypted registered
Jan 21 17:44:03 np0005591285 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 21 17:44:03 np0005591285 kernel: Loading compiled-in module X.509 certificates
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 17:44:03 np0005591285 kernel: ima: Allocated hash algorithm: sha256
Jan 21 17:44:03 np0005591285 kernel: ima: No architecture policies found
Jan 21 17:44:03 np0005591285 kernel: evm: Initialising EVM extended attributes:
Jan 21 17:44:03 np0005591285 kernel: evm: security.selinux
Jan 21 17:44:03 np0005591285 kernel: evm: security.SMACK64 (disabled)
Jan 21 17:44:03 np0005591285 kernel: evm: security.SMACK64EXEC (disabled)
Jan 21 17:44:03 np0005591285 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 21 17:44:03 np0005591285 kernel: evm: security.SMACK64MMAP (disabled)
Jan 21 17:44:03 np0005591285 kernel: evm: security.apparmor (disabled)
Jan 21 17:44:03 np0005591285 kernel: evm: security.ima
Jan 21 17:44:03 np0005591285 kernel: evm: security.capability
Jan 21 17:44:03 np0005591285 kernel: evm: HMAC attrs: 0x1
Jan 21 17:44:03 np0005591285 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 21 17:44:03 np0005591285 kernel: Running certificate verification RSA selftest
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 21 17:44:03 np0005591285 kernel: Running certificate verification ECDSA selftest
Jan 21 17:44:03 np0005591285 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 21 17:44:03 np0005591285 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 21 17:44:03 np0005591285 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 21 17:44:03 np0005591285 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 21 17:44:03 np0005591285 kernel: usb 1-1: Manufacturer: QEMU
Jan 21 17:44:03 np0005591285 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 21 17:44:03 np0005591285 kernel: clk: Disabling unused clocks
Jan 21 17:44:03 np0005591285 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 21 17:44:03 np0005591285 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 21 17:44:03 np0005591285 kernel: Freeing unused decrypted memory: 2028K
Jan 21 17:44:03 np0005591285 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 21 17:44:03 np0005591285 kernel: Write protecting the kernel read-only data: 30720k
Jan 21 17:44:03 np0005591285 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 21 17:44:03 np0005591285 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 21 17:44:03 np0005591285 kernel: Run /init as init process
Jan 21 17:44:03 np0005591285 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 17:44:03 np0005591285 systemd: Detected virtualization kvm.
Jan 21 17:44:03 np0005591285 systemd: Detected architecture x86-64.
Jan 21 17:44:03 np0005591285 systemd: Running in initrd.
Jan 21 17:44:03 np0005591285 systemd: No hostname configured, using default hostname.
Jan 21 17:44:03 np0005591285 systemd: Hostname set to <localhost>.
Jan 21 17:44:03 np0005591285 systemd: Initializing machine ID from VM UUID.
Jan 21 17:44:03 np0005591285 systemd: Queued start job for default target Initrd Default Target.
Jan 21 17:44:03 np0005591285 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 17:44:03 np0005591285 systemd: Reached target Local Encrypted Volumes.
Jan 21 17:44:03 np0005591285 systemd: Reached target Initrd /usr File System.
Jan 21 17:44:03 np0005591285 systemd: Reached target Local File Systems.
Jan 21 17:44:03 np0005591285 systemd: Reached target Path Units.
Jan 21 17:44:03 np0005591285 systemd: Reached target Slice Units.
Jan 21 17:44:03 np0005591285 systemd: Reached target Swaps.
Jan 21 17:44:03 np0005591285 systemd: Reached target Timer Units.
Jan 21 17:44:03 np0005591285 systemd: Listening on D-Bus System Message Bus Socket.
Jan 21 17:44:03 np0005591285 systemd: Listening on Journal Socket (/dev/log).
Jan 21 17:44:03 np0005591285 systemd: Listening on Journal Socket.
Jan 21 17:44:03 np0005591285 systemd: Listening on udev Control Socket.
Jan 21 17:44:03 np0005591285 systemd: Listening on udev Kernel Socket.
Jan 21 17:44:03 np0005591285 systemd: Reached target Socket Units.
Jan 21 17:44:03 np0005591285 systemd: Starting Create List of Static Device Nodes...
Jan 21 17:44:03 np0005591285 systemd: Starting Journal Service...
Jan 21 17:44:03 np0005591285 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 17:44:03 np0005591285 systemd: Starting Apply Kernel Variables...
Jan 21 17:44:03 np0005591285 systemd: Starting Create System Users...
Jan 21 17:44:03 np0005591285 systemd: Starting Setup Virtual Console...
Jan 21 17:44:03 np0005591285 systemd: Finished Create List of Static Device Nodes.
Jan 21 17:44:03 np0005591285 systemd: Finished Apply Kernel Variables.
Jan 21 17:44:03 np0005591285 systemd: Finished Create System Users.
Jan 21 17:44:03 np0005591285 systemd-journald[303]: Journal started
Jan 21 17:44:03 np0005591285 systemd-journald[303]: Runtime Journal (/run/log/journal/632224e8817e4a21811283934a2544f5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 17:44:03 np0005591285 systemd-sysusers[307]: Creating group 'users' with GID 100.
Jan 21 17:44:03 np0005591285 systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Jan 21 17:44:03 np0005591285 systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 21 17:44:03 np0005591285 systemd: Started Journal Service.
Jan 21 17:44:03 np0005591285 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 17:44:03 np0005591285 systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 17:44:03 np0005591285 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 17:44:03 np0005591285 systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 17:44:03 np0005591285 systemd[1]: Finished Setup Virtual Console.
Jan 21 17:44:03 np0005591285 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 21 17:44:03 np0005591285 systemd[1]: Starting dracut cmdline hook...
Jan 21 17:44:03 np0005591285 dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Jan 21 17:44:03 np0005591285 dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 17:44:03 np0005591285 systemd[1]: Finished dracut cmdline hook.
Jan 21 17:44:03 np0005591285 systemd[1]: Starting dracut pre-udev hook...
Jan 21 17:44:03 np0005591285 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 21 17:44:03 np0005591285 kernel: device-mapper: uevent: version 1.0.3
Jan 21 17:44:03 np0005591285 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 21 17:44:03 np0005591285 kernel: RPC: Registered named UNIX socket transport module.
Jan 21 17:44:03 np0005591285 kernel: RPC: Registered udp transport module.
Jan 21 17:44:03 np0005591285 kernel: RPC: Registered tcp transport module.
Jan 21 17:44:03 np0005591285 kernel: RPC: Registered tcp-with-tls transport module.
Jan 21 17:44:03 np0005591285 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 21 17:44:03 np0005591285 rpc.statd[441]: Version 2.5.4 starting
Jan 21 17:44:03 np0005591285 rpc.statd[441]: Initializing NSM state
Jan 21 17:44:04 np0005591285 rpc.idmapd[446]: Setting log level to 0
Jan 21 17:44:04 np0005591285 systemd[1]: Finished dracut pre-udev hook.
Jan 21 17:44:04 np0005591285 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 17:44:04 np0005591285 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 17:44:04 np0005591285 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 17:44:04 np0005591285 systemd[1]: Starting dracut pre-trigger hook...
Jan 21 17:44:04 np0005591285 systemd[1]: Finished dracut pre-trigger hook.
Jan 21 17:44:04 np0005591285 systemd[1]: Starting Coldplug All udev Devices...
Jan 21 17:44:04 np0005591285 systemd[1]: Created slice Slice /system/modprobe.
Jan 21 17:44:04 np0005591285 systemd[1]: Starting Load Kernel Module configfs...
Jan 21 17:44:04 np0005591285 systemd[1]: Finished Coldplug All udev Devices.
Jan 21 17:44:04 np0005591285 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 17:44:04 np0005591285 systemd[1]: Finished Load Kernel Module configfs.
Jan 21 17:44:04 np0005591285 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target Network.
Jan 21 17:44:04 np0005591285 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 17:44:04 np0005591285 systemd[1]: Starting dracut initqueue hook...
Jan 21 17:44:04 np0005591285 systemd[1]: Mounting Kernel Configuration File System...
Jan 21 17:44:04 np0005591285 systemd[1]: Mounted Kernel Configuration File System.
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target System Initialization.
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target Basic System.
Jan 21 17:44:04 np0005591285 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 21 17:44:04 np0005591285 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 21 17:44:04 np0005591285 kernel: vda: vda1
Jan 21 17:44:04 np0005591285 kernel: scsi host0: ata_piix
Jan 21 17:44:04 np0005591285 systemd-udevd[498]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:44:04 np0005591285 kernel: scsi host1: ata_piix
Jan 21 17:44:04 np0005591285 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 21 17:44:04 np0005591285 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 21 17:44:04 np0005591285 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target Initrd Root Device.
Jan 21 17:44:04 np0005591285 kernel: ata1: found unknown device (class 0)
Jan 21 17:44:04 np0005591285 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 21 17:44:04 np0005591285 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 21 17:44:04 np0005591285 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 21 17:44:04 np0005591285 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 21 17:44:04 np0005591285 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 21 17:44:04 np0005591285 systemd[1]: Finished dracut initqueue hook.
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 21 17:44:04 np0005591285 systemd[1]: Reached target Remote File Systems.
Jan 21 17:44:04 np0005591285 systemd[1]: Starting dracut pre-mount hook...
Jan 21 17:44:04 np0005591285 systemd[1]: Finished dracut pre-mount hook.
Jan 21 17:44:04 np0005591285 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 21 17:44:04 np0005591285 systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 21 17:44:04 np0005591285 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 17:44:04 np0005591285 systemd[1]: Mounting /sysroot...
Jan 21 17:44:05 np0005591285 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 21 17:44:05 np0005591285 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 21 17:44:05 np0005591285 kernel: XFS (vda1): Ending clean mount
Jan 21 17:44:05 np0005591285 systemd[1]: Mounted /sysroot.
Jan 21 17:44:05 np0005591285 systemd[1]: Reached target Initrd Root File System.
Jan 21 17:44:05 np0005591285 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 21 17:44:05 np0005591285 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 21 17:44:05 np0005591285 systemd[1]: Reached target Initrd File Systems.
Jan 21 17:44:05 np0005591285 systemd[1]: Reached target Initrd Default Target.
Jan 21 17:44:05 np0005591285 systemd[1]: Starting dracut mount hook...
Jan 21 17:44:05 np0005591285 systemd[1]: Finished dracut mount hook.
Jan 21 17:44:05 np0005591285 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 21 17:44:05 np0005591285 rpc.idmapd[446]: exiting on signal 15
Jan 21 17:44:05 np0005591285 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 21 17:44:05 np0005591285 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Network.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Timer Units.
Jan 21 17:44:05 np0005591285 systemd[1]: dbus.socket: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Initrd Default Target.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Basic System.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Initrd Root Device.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Initrd /usr File System.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Path Units.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Remote File Systems.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Slice Units.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Socket Units.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target System Initialization.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Local File Systems.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Swaps.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut mount hook.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut pre-mount hook.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut initqueue hook.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Apply Kernel Variables.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Coldplug All udev Devices.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut pre-trigger hook.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Setup Virtual Console.
Jan 21 17:44:05 np0005591285 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Closed udev Control Socket.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Closed udev Kernel Socket.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut pre-udev hook.
Jan 21 17:44:05 np0005591285 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped dracut cmdline hook.
Jan 21 17:44:05 np0005591285 systemd[1]: Starting Cleanup udev Database...
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 21 17:44:05 np0005591285 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 21 17:44:05 np0005591285 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Stopped Create System Users.
Jan 21 17:44:05 np0005591285 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 21 17:44:05 np0005591285 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 21 17:44:05 np0005591285 systemd[1]: Finished Cleanup udev Database.
Jan 21 17:44:05 np0005591285 systemd[1]: Reached target Switch Root.
Jan 21 17:44:05 np0005591285 systemd[1]: Starting Switch Root...
Jan 21 17:44:05 np0005591285 systemd[1]: Switching root.
Jan 21 17:44:05 np0005591285 systemd-journald[303]: Journal stopped
Jan 21 17:44:06 np0005591285 systemd-journald: Received SIGTERM from PID 1 (systemd).
Jan 21 17:44:06 np0005591285 kernel: audit: type=1404 audit(1769035445.801:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:44:06 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:44:06 np0005591285 kernel: audit: type=1403 audit(1769035445.928:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 21 17:44:06 np0005591285 systemd: Successfully loaded SELinux policy in 130.703ms.
Jan 21 17:44:06 np0005591285 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.214ms.
Jan 21 17:44:06 np0005591285 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 17:44:06 np0005591285 systemd: Detected virtualization kvm.
Jan 21 17:44:06 np0005591285 systemd: Detected architecture x86-64.
Jan 21 17:44:06 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:44:06 np0005591285 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 systemd: Stopped Switch Root.
Jan 21 17:44:06 np0005591285 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 21 17:44:06 np0005591285 systemd: Created slice Slice /system/getty.
Jan 21 17:44:06 np0005591285 systemd: Created slice Slice /system/serial-getty.
Jan 21 17:44:06 np0005591285 systemd: Created slice Slice /system/sshd-keygen.
Jan 21 17:44:06 np0005591285 systemd: Created slice User and Session Slice.
Jan 21 17:44:06 np0005591285 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 17:44:06 np0005591285 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 21 17:44:06 np0005591285 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 21 17:44:06 np0005591285 systemd: Reached target Local Encrypted Volumes.
Jan 21 17:44:06 np0005591285 systemd: Stopped target Switch Root.
Jan 21 17:44:06 np0005591285 systemd: Stopped target Initrd File Systems.
Jan 21 17:44:06 np0005591285 systemd: Stopped target Initrd Root File System.
Jan 21 17:44:06 np0005591285 systemd: Reached target Local Integrity Protected Volumes.
Jan 21 17:44:06 np0005591285 systemd: Reached target Path Units.
Jan 21 17:44:06 np0005591285 systemd: Reached target rpc_pipefs.target.
Jan 21 17:44:06 np0005591285 systemd: Reached target Slice Units.
Jan 21 17:44:06 np0005591285 systemd: Reached target Swaps.
Jan 21 17:44:06 np0005591285 systemd: Reached target Local Verity Protected Volumes.
Jan 21 17:44:06 np0005591285 systemd: Listening on RPCbind Server Activation Socket.
Jan 21 17:44:06 np0005591285 systemd: Reached target RPC Port Mapper.
Jan 21 17:44:06 np0005591285 systemd: Listening on Process Core Dump Socket.
Jan 21 17:44:06 np0005591285 systemd: Listening on initctl Compatibility Named Pipe.
Jan 21 17:44:06 np0005591285 systemd: Listening on udev Control Socket.
Jan 21 17:44:06 np0005591285 systemd: Listening on udev Kernel Socket.
Jan 21 17:44:06 np0005591285 systemd: Mounting Huge Pages File System...
Jan 21 17:44:06 np0005591285 systemd: Mounting POSIX Message Queue File System...
Jan 21 17:44:06 np0005591285 systemd: Mounting Kernel Debug File System...
Jan 21 17:44:06 np0005591285 systemd: Mounting Kernel Trace File System...
Jan 21 17:44:06 np0005591285 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 17:44:06 np0005591285 systemd: Starting Create List of Static Device Nodes...
Jan 21 17:44:06 np0005591285 systemd: Starting Load Kernel Module configfs...
Jan 21 17:44:06 np0005591285 systemd: Starting Load Kernel Module drm...
Jan 21 17:44:06 np0005591285 systemd: Starting Load Kernel Module efi_pstore...
Jan 21 17:44:06 np0005591285 systemd: Starting Load Kernel Module fuse...
Jan 21 17:44:06 np0005591285 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 21 17:44:06 np0005591285 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 systemd: Stopped File System Check on Root Device.
Jan 21 17:44:06 np0005591285 systemd: Stopped Journal Service.
Jan 21 17:44:06 np0005591285 systemd: Starting Journal Service...
Jan 21 17:44:06 np0005591285 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 17:44:06 np0005591285 systemd: Starting Generate network units from Kernel command line...
Jan 21 17:44:06 np0005591285 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 17:44:06 np0005591285 systemd: Starting Remount Root and Kernel File Systems...
Jan 21 17:44:06 np0005591285 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 21 17:44:06 np0005591285 systemd: Starting Apply Kernel Variables...
Jan 21 17:44:06 np0005591285 systemd-journald[680]: Journal started
Jan 21 17:44:06 np0005591285 systemd-journald[680]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 17:44:06 np0005591285 systemd[1]: Queued start job for default target Multi-User System.
Jan 21 17:44:06 np0005591285 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 21 17:44:06 np0005591285 kernel: ACPI: bus type drm_connector registered
Jan 21 17:44:06 np0005591285 systemd: Starting Coldplug All udev Devices...
Jan 21 17:44:06 np0005591285 kernel: fuse: init (API version 7.37)
Jan 21 17:44:06 np0005591285 systemd: Started Journal Service.
Jan 21 17:44:06 np0005591285 systemd[1]: Mounted Huge Pages File System.
Jan 21 17:44:06 np0005591285 systemd[1]: Mounted POSIX Message Queue File System.
Jan 21 17:44:06 np0005591285 systemd[1]: Mounted Kernel Debug File System.
Jan 21 17:44:06 np0005591285 systemd[1]: Mounted Kernel Trace File System.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 17:44:06 np0005591285 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Load Kernel Module configfs.
Jan 21 17:44:06 np0005591285 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Load Kernel Module drm.
Jan 21 17:44:06 np0005591285 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 21 17:44:06 np0005591285 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Load Kernel Module fuse.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Generate network units from Kernel command line.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Apply Kernel Variables.
Jan 21 17:44:06 np0005591285 systemd[1]: Mounting FUSE Control File System...
Jan 21 17:44:06 np0005591285 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Rebuild Hardware Database...
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 21 17:44:06 np0005591285 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Load/Save OS Random Seed...
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Create System Users...
Jan 21 17:44:06 np0005591285 systemd[1]: Mounted FUSE Control File System.
Jan 21 17:44:06 np0005591285 systemd-journald[680]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 17:44:06 np0005591285 systemd-journald[680]: Received client request to flush runtime journal.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Load/Save OS Random Seed.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 21 17:44:06 np0005591285 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Coldplug All udev Devices.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Create System Users.
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 17:44:06 np0005591285 systemd[1]: Reached target Preparation for Local File Systems.
Jan 21 17:44:06 np0005591285 systemd[1]: Reached target Local File Systems.
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 21 17:44:06 np0005591285 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 21 17:44:06 np0005591285 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 21 17:44:06 np0005591285 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Automatic Boot Loader Update...
Jan 21 17:44:06 np0005591285 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 17:44:06 np0005591285 bootctl[700]: Couldn't find EFI system partition, skipping.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Automatic Boot Loader Update.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Security Auditing Service...
Jan 21 17:44:06 np0005591285 systemd[1]: Starting RPC Bind...
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Rebuild Journal Catalog...
Jan 21 17:44:06 np0005591285 auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 21 17:44:06 np0005591285 auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 21 17:44:06 np0005591285 systemd[1]: Finished Rebuild Journal Catalog.
Jan 21 17:44:06 np0005591285 systemd[1]: Started RPC Bind.
Jan 21 17:44:06 np0005591285 augenrules[711]: /sbin/augenrules: No change
Jan 21 17:44:06 np0005591285 augenrules[726]: No rules
Jan 21 17:44:06 np0005591285 augenrules[726]: enabled 1
Jan 21 17:44:06 np0005591285 augenrules[726]: failure 1
Jan 21 17:44:06 np0005591285 augenrules[726]: pid 706
Jan 21 17:44:06 np0005591285 augenrules[726]: rate_limit 0
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_limit 8192
Jan 21 17:44:06 np0005591285 augenrules[726]: lost 0
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog 1
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_wait_time 60000
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_wait_time_actual 0
Jan 21 17:44:06 np0005591285 augenrules[726]: enabled 1
Jan 21 17:44:06 np0005591285 augenrules[726]: failure 1
Jan 21 17:44:06 np0005591285 augenrules[726]: pid 706
Jan 21 17:44:06 np0005591285 augenrules[726]: rate_limit 0
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_limit 8192
Jan 21 17:44:06 np0005591285 augenrules[726]: lost 0
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog 2
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_wait_time 60000
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_wait_time_actual 0
Jan 21 17:44:06 np0005591285 augenrules[726]: enabled 1
Jan 21 17:44:06 np0005591285 augenrules[726]: failure 1
Jan 21 17:44:06 np0005591285 augenrules[726]: pid 706
Jan 21 17:44:06 np0005591285 augenrules[726]: rate_limit 0
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_limit 8192
Jan 21 17:44:06 np0005591285 augenrules[726]: lost 0
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog 3
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_wait_time 60000
Jan 21 17:44:06 np0005591285 augenrules[726]: backlog_wait_time_actual 0
Jan 21 17:44:06 np0005591285 systemd[1]: Started Security Auditing Service.
Jan 21 17:44:06 np0005591285 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 21 17:44:07 np0005591285 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 21 17:44:07 np0005591285 systemd[1]: Finished Rebuild Hardware Database.
Jan 21 17:44:07 np0005591285 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 17:44:07 np0005591285 systemd[1]: Starting Update is Completed...
Jan 21 17:44:07 np0005591285 systemd[1]: Finished Update is Completed.
Jan 21 17:44:07 np0005591285 systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 17:44:07 np0005591285 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 17:44:07 np0005591285 systemd[1]: Reached target System Initialization.
Jan 21 17:44:07 np0005591285 systemd[1]: Started dnf makecache --timer.
Jan 21 17:44:07 np0005591285 systemd[1]: Started Daily rotation of log files.
Jan 21 17:44:07 np0005591285 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 21 17:44:07 np0005591285 systemd[1]: Reached target Timer Units.
Jan 21 17:44:07 np0005591285 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 17:44:07 np0005591285 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 21 17:44:07 np0005591285 systemd[1]: Reached target Socket Units.
Jan 21 17:44:07 np0005591285 systemd[1]: Starting D-Bus System Message Bus...
Jan 21 17:44:07 np0005591285 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 17:44:07 np0005591285 systemd[1]: Starting Load Kernel Module configfs...
Jan 21 17:44:07 np0005591285 systemd-udevd[739]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:44:07 np0005591285 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 17:44:07 np0005591285 systemd[1]: Finished Load Kernel Module configfs.
Jan 21 17:44:07 np0005591285 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 21 17:44:07 np0005591285 systemd[1]: Started D-Bus System Message Bus.
Jan 21 17:44:07 np0005591285 systemd[1]: Reached target Basic System.
Jan 21 17:44:07 np0005591285 dbus-broker-lau[764]: Ready
Jan 21 17:44:07 np0005591285 systemd[1]: Starting NTP client/server...
Jan 21 17:44:07 np0005591285 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 21 17:44:07 np0005591285 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 21 17:44:07 np0005591285 systemd[1]: Starting IPv4 firewall with iptables...
Jan 21 17:44:07 np0005591285 systemd[1]: Started irqbalance daemon.
Jan 21 17:44:07 np0005591285 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 21 17:44:07 np0005591285 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:44:07 np0005591285 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:44:07 np0005591285 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 17:44:07 np0005591285 systemd[1]: Reached target sshd-keygen.target.
Jan 21 17:44:07 np0005591285 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 21 17:44:07 np0005591285 systemd[1]: Reached target User and Group Name Lookups.
Jan 21 17:44:07 np0005591285 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 21 17:44:07 np0005591285 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 21 17:44:07 np0005591285 chronyd[793]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 17:44:07 np0005591285 systemd[1]: Starting User Login Management...
Jan 21 17:44:07 np0005591285 chronyd[793]: Loaded 0 symmetric keys
Jan 21 17:44:07 np0005591285 systemd[1]: Started NTP client/server.
Jan 21 17:44:07 np0005591285 chronyd[793]: Using right/UTC timezone to obtain leap second data
Jan 21 17:44:07 np0005591285 chronyd[793]: Loaded seccomp filter (level 2)
Jan 21 17:44:07 np0005591285 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 21 17:44:07 np0005591285 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 21 17:44:07 np0005591285 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 21 17:44:07 np0005591285 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 21 17:44:07 np0005591285 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 17:44:07 np0005591285 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 17:44:07 np0005591285 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 21 17:44:07 np0005591285 systemd-logind[788]: New seat seat0.
Jan 21 17:44:07 np0005591285 systemd[1]: Started User Login Management.
Jan 21 17:44:07 np0005591285 kernel: kvm_amd: TSC scaling supported
Jan 21 17:44:07 np0005591285 kernel: kvm_amd: Nested Virtualization enabled
Jan 21 17:44:07 np0005591285 kernel: kvm_amd: Nested Paging enabled
Jan 21 17:44:07 np0005591285 kernel: kvm_amd: LBR virtualization supported
Jan 21 17:44:07 np0005591285 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 21 17:44:07 np0005591285 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 21 17:44:07 np0005591285 kernel: Console: switching to colour dummy device 80x25
Jan 21 17:44:07 np0005591285 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 21 17:44:07 np0005591285 kernel: [drm] features: -context_init
Jan 21 17:44:07 np0005591285 kernel: [drm] number of scanouts: 1
Jan 21 17:44:07 np0005591285 kernel: [drm] number of cap sets: 0
Jan 21 17:44:07 np0005591285 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 21 17:44:07 np0005591285 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 21 17:44:07 np0005591285 kernel: Console: switching to colour frame buffer device 128x48
Jan 21 17:44:07 np0005591285 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 21 17:44:07 np0005591285 iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Jan 21 17:44:07 np0005591285 systemd[1]: Finished IPv4 firewall with iptables.
Jan 21 17:44:07 np0005591285 cloud-init[842]: Cloud-init v. 24.4-8.el9 running 'init-local' at Wed, 21 Jan 2026 22:44:07 +0000. Up 6.55 seconds.
Jan 21 17:44:08 np0005591285 systemd[1]: run-cloud\x2dinit-tmp-tmp7rojcxe8.mount: Deactivated successfully.
Jan 21 17:44:08 np0005591285 systemd[1]: Starting Hostname Service...
Jan 21 17:44:08 np0005591285 systemd[1]: Started Hostname Service.
Jan 21 17:44:08 np0005591285 systemd-hostnamed[856]: Hostname set to <np0005591285.novalocal> (static)
Jan 21 17:44:08 np0005591285 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 21 17:44:08 np0005591285 systemd[1]: Reached target Preparation for Network.
Jan 21 17:44:08 np0005591285 systemd[1]: Starting Network Manager...
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4091] NetworkManager (version 1.54.3-2.el9) is starting... (boot:2993e076-ac8d-4723-86a0-913496004632)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4094] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4152] manager[0x5626fcbaa000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4194] hostname: hostname: using hostnamed
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4194] hostname: static hostname changed from (none) to "np0005591285.novalocal"
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4198] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4280] manager[0x5626fcbaa000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4281] manager[0x5626fcbaa000]: rfkill: WWAN hardware radio set enabled
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4316] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4317] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4317] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4318] manager: Networking is enabled by state file
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4320] settings: Loaded settings plugin: keyfile (internal)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4328] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4346] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 17:44:08 np0005591285 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4355] dhcp: init: Using DHCP client 'internal'
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4357] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4367] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4374] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4381] device (lo): Activation: starting connection 'lo' (e8b88d7b-c546-4855-a53a-f2271e918cb0)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4388] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4391] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4418] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4421] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4423] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4425] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4426] device (eth0): carrier: link connected
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4429] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4434] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4439] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4442] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4443] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4445] manager: NetworkManager state is now CONNECTING
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4446] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4451] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4454] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:44:08 np0005591285 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:44:08 np0005591285 systemd[1]: Started Network Manager.
Jan 21 17:44:08 np0005591285 systemd[1]: Reached target Network.
Jan 21 17:44:08 np0005591285 systemd[1]: Starting Network Manager Wait Online...
Jan 21 17:44:08 np0005591285 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4712] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4715] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 17:44:08 np0005591285 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.4719] device (lo): Activation: successful, device activated.
Jan 21 17:44:08 np0005591285 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 21 17:44:08 np0005591285 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 17:44:08 np0005591285 systemd[1]: Reached target NFS client services.
Jan 21 17:44:08 np0005591285 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 17:44:08 np0005591285 systemd[1]: Reached target Remote File Systems.
Jan 21 17:44:08 np0005591285 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7397] dhcp4 (eth0): state changed new lease, address=38.102.83.145
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7409] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7428] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7454] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7456] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7458] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7460] device (eth0): Activation: successful, device activated.
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7465] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 17:44:08 np0005591285 NetworkManager[860]: <info>  [1769035448.7473] manager: startup complete
Jan 21 17:44:08 np0005591285 systemd[1]: Finished Network Manager Wait Online.
Jan 21 17:44:08 np0005591285 systemd[1]: Starting Cloud-init: Network Stage...
Jan 21 17:44:09 np0005591285 cloud-init[923]: Cloud-init v. 24.4-8.el9 running 'init' at Wed, 21 Jan 2026 22:44:09 +0000. Up 7.77 seconds.
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.145         | 255.255.255.0 | global | fa:16:3e:42:5b:77 |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe42:5b77/64 |       .       |  link  | fa:16:3e:42:5b:77 |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 21 17:44:09 np0005591285 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 17:44:10 np0005591285 cloud-init[923]: Generating public/private rsa key pair.
Jan 21 17:44:10 np0005591285 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 21 17:44:10 np0005591285 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 21 17:44:10 np0005591285 cloud-init[923]: The key fingerprint is:
Jan 21 17:44:10 np0005591285 cloud-init[923]: SHA256:0ZQITrzN8K/UUgx2BbB127auS+0u8lkYBOVR9iJya0k root@np0005591285.novalocal
Jan 21 17:44:10 np0005591285 cloud-init[923]: The key's randomart image is:
Jan 21 17:44:10 np0005591285 cloud-init[923]: +---[RSA 3072]----+
Jan 21 17:44:10 np0005591285 cloud-init[923]: |     .o..o===.o  |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |     oo +=o+ = . |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |      .Bo+o E + .|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |      . +.o= = o |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |        S+  = .  |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |        o o. =   |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |       . o  o +  |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |        . ...=   |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |           o=+o  |
Jan 21 17:44:10 np0005591285 cloud-init[923]: +----[SHA256]-----+
Jan 21 17:44:10 np0005591285 cloud-init[923]: Generating public/private ecdsa key pair.
Jan 21 17:44:10 np0005591285 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 21 17:44:10 np0005591285 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 21 17:44:10 np0005591285 cloud-init[923]: The key fingerprint is:
Jan 21 17:44:10 np0005591285 cloud-init[923]: SHA256:qy4UMkRFrLhZUZi5ecPbCCTJSyHWVOwwCmFOQq0Kghk root@np0005591285.novalocal
Jan 21 17:44:10 np0005591285 cloud-init[923]: The key's randomart image is:
Jan 21 17:44:10 np0005591285 cloud-init[923]: +---[ECDSA 256]---+
Jan 21 17:44:10 np0005591285 cloud-init[923]: |*X*X*.           |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |EoB=..           |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |+O+*+            |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |*+O =.           |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |++ = *  S        |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |+   + .  .       |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |   .    .        |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |    .  .         |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |     oo          |
Jan 21 17:44:10 np0005591285 cloud-init[923]: +----[SHA256]-----+
Jan 21 17:44:10 np0005591285 cloud-init[923]: Generating public/private ed25519 key pair.
Jan 21 17:44:10 np0005591285 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 21 17:44:10 np0005591285 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 21 17:44:10 np0005591285 cloud-init[923]: The key fingerprint is:
Jan 21 17:44:10 np0005591285 cloud-init[923]: SHA256:ayyV0vc10AEmTxtYdv47sfTLn4N3xcaUgCMU/D4qXVc root@np0005591285.novalocal
Jan 21 17:44:10 np0005591285 cloud-init[923]: The key's randomart image is:
Jan 21 17:44:10 np0005591285 cloud-init[923]: +--[ED25519 256]--+
Jan 21 17:44:10 np0005591285 cloud-init[923]: |        ooooO.o  |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |         o.O B . |
Jan 21 17:44:10 np0005591285 cloud-init[923]: |          o = + .|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |       . . . . E.|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |      . S o   +*.|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |       + o = o..X|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |      . = o + .=o|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |       + o   ..o=|
Jan 21 17:44:10 np0005591285 cloud-init[923]: |        .     .+=|
Jan 21 17:44:10 np0005591285 cloud-init[923]: +----[SHA256]-----+
Jan 21 17:44:10 np0005591285 systemd[1]: Finished Cloud-init: Network Stage.
Jan 21 17:44:10 np0005591285 systemd[1]: Reached target Cloud-config availability.
Jan 21 17:44:10 np0005591285 systemd[1]: Reached target Network is Online.
Jan 21 17:44:10 np0005591285 systemd[1]: Starting Cloud-init: Config Stage...
Jan 21 17:44:10 np0005591285 systemd[1]: Starting Crash recovery kernel arming...
Jan 21 17:44:10 np0005591285 systemd[1]: Starting Notify NFS peers of a restart...
Jan 21 17:44:10 np0005591285 systemd[1]: Starting System Logging Service...
Jan 21 17:44:10 np0005591285 sm-notify[1005]: Version 2.5.4 starting
Jan 21 17:44:10 np0005591285 systemd[1]: Starting OpenSSH server daemon...
Jan 21 17:44:10 np0005591285 systemd[1]: Starting Permit User Sessions...
Jan 21 17:44:10 np0005591285 systemd[1]: Started Notify NFS peers of a restart.
Jan 21 17:44:10 np0005591285 systemd[1]: Started OpenSSH server daemon.
Jan 21 17:44:10 np0005591285 systemd[1]: Finished Permit User Sessions.
Jan 21 17:44:10 np0005591285 systemd[1]: Started Command Scheduler.
Jan 21 17:44:10 np0005591285 systemd[1]: Started Getty on tty1.
Jan 21 17:44:10 np0005591285 systemd[1]: Started Serial Getty on ttyS0.
Jan 21 17:44:10 np0005591285 systemd[1]: Reached target Login Prompts.
Jan 21 17:44:10 np0005591285 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 21 17:44:10 np0005591285 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 21 17:44:10 np0005591285 systemd[1]: Started System Logging Service.
Jan 21 17:44:10 np0005591285 systemd[1]: Reached target Multi-User System.
Jan 21 17:44:10 np0005591285 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 21 17:44:10 np0005591285 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 21 17:44:10 np0005591285 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 21 17:44:10 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 17:44:10 np0005591285 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Jan 21 17:44:10 np0005591285 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 21 17:44:10 np0005591285 cloud-init[1114]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Wed, 21 Jan 2026 22:44:10 +0000. Up 9.63 seconds.
Jan 21 17:44:11 np0005591285 systemd[1]: Finished Cloud-init: Config Stage.
Jan 21 17:44:11 np0005591285 systemd[1]: Starting Cloud-init: Final Stage...
Jan 21 17:44:11 np0005591285 cloud-init[1271]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Wed, 21 Jan 2026 22:44:11 +0000. Up 10.03 seconds.
Jan 21 17:44:11 np0005591285 dracut[1277]: dracut-057-102.git20250818.el9
Jan 21 17:44:11 np0005591285 cloud-init[1296]: #############################################################
Jan 21 17:44:11 np0005591285 cloud-init[1300]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 21 17:44:11 np0005591285 cloud-init[1302]: 256 SHA256:qy4UMkRFrLhZUZi5ecPbCCTJSyHWVOwwCmFOQq0Kghk root@np0005591285.novalocal (ECDSA)
Jan 21 17:44:11 np0005591285 cloud-init[1305]: 256 SHA256:ayyV0vc10AEmTxtYdv47sfTLn4N3xcaUgCMU/D4qXVc root@np0005591285.novalocal (ED25519)
Jan 21 17:44:11 np0005591285 cloud-init[1309]: 3072 SHA256:0ZQITrzN8K/UUgx2BbB127auS+0u8lkYBOVR9iJya0k root@np0005591285.novalocal (RSA)
Jan 21 17:44:11 np0005591285 cloud-init[1311]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 21 17:44:11 np0005591285 cloud-init[1312]: #############################################################
Jan 21 17:44:11 np0005591285 cloud-init[1271]: Cloud-init v. 24.4-8.el9 finished at Wed, 21 Jan 2026 22:44:11 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.22 seconds
Jan 21 17:44:11 np0005591285 dracut[1281]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 21 17:44:11 np0005591285 systemd[1]: Finished Cloud-init: Final Stage.
Jan 21 17:44:11 np0005591285 systemd[1]: Reached target Cloud-init target.
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 17:44:12 np0005591285 dracut[1281]: memstrack is not available
Jan 21 17:44:13 np0005591285 dracut[1281]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 17:44:13 np0005591285 dracut[1281]: memstrack is not available
Jan 21 17:44:13 np0005591285 dracut[1281]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 17:44:13 np0005591285 dracut[1281]: *** Including module: systemd ***
Jan 21 17:44:13 np0005591285 chronyd[793]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 21 17:44:13 np0005591285 chronyd[793]: System clock TAI offset set to 37 seconds
Jan 21 17:44:13 np0005591285 dracut[1281]: *** Including module: fips ***
Jan 21 17:44:14 np0005591285 dracut[1281]: *** Including module: systemd-initrd ***
Jan 21 17:44:14 np0005591285 dracut[1281]: *** Including module: i18n ***
Jan 21 17:44:14 np0005591285 dracut[1281]: *** Including module: drm ***
Jan 21 17:44:14 np0005591285 dracut[1281]: *** Including module: prefixdevname ***
Jan 21 17:44:14 np0005591285 dracut[1281]: *** Including module: kernel-modules ***
Jan 21 17:44:15 np0005591285 kernel: block vda: the capability attribute has been deprecated.
Jan 21 17:44:15 np0005591285 dracut[1281]: *** Including module: kernel-modules-extra ***
Jan 21 17:44:15 np0005591285 dracut[1281]: *** Including module: qemu ***
Jan 21 17:44:15 np0005591285 dracut[1281]: *** Including module: fstab-sys ***
Jan 21 17:44:15 np0005591285 dracut[1281]: *** Including module: rootfs-block ***
Jan 21 17:44:15 np0005591285 dracut[1281]: *** Including module: terminfo ***
Jan 21 17:44:15 np0005591285 dracut[1281]: *** Including module: udev-rules ***
Jan 21 17:44:16 np0005591285 dracut[1281]: Skipping udev rule: 91-permissions.rules
Jan 21 17:44:16 np0005591285 dracut[1281]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 21 17:44:16 np0005591285 dracut[1281]: *** Including module: virtiofs ***
Jan 21 17:44:16 np0005591285 dracut[1281]: *** Including module: dracut-systemd ***
Jan 21 17:44:16 np0005591285 dracut[1281]: *** Including module: usrmount ***
Jan 21 17:44:16 np0005591285 dracut[1281]: *** Including module: base ***
Jan 21 17:44:16 np0005591285 dracut[1281]: *** Including module: fs-lib ***
Jan 21 17:44:16 np0005591285 dracut[1281]: *** Including module: kdumpbase ***
Jan 21 17:44:17 np0005591285 dracut[1281]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 21 17:44:17 np0005591285 dracut[1281]:  microcode_ctl module: mangling fw_dir
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 21 17:44:17 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 21 17:44:18 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 21 17:44:18 np0005591285 dracut[1281]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 21 17:44:18 np0005591285 dracut[1281]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 21 17:44:18 np0005591285 dracut[1281]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 21 17:44:18 np0005591285 dracut[1281]: *** Including module: openssl ***
Jan 21 17:44:18 np0005591285 dracut[1281]: *** Including module: shutdown ***
Jan 21 17:44:18 np0005591285 dracut[1281]: *** Including module: squash ***
Jan 21 17:44:18 np0005591285 dracut[1281]: *** Including modules done ***
Jan 21 17:44:18 np0005591285 dracut[1281]: *** Installing kernel module dependencies ***
Jan 21 17:44:18 np0005591285 irqbalance[782]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 21 17:44:18 np0005591285 irqbalance[782]: IRQ 25 affinity is now unmanaged
Jan 21 17:44:18 np0005591285 irqbalance[782]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 21 17:44:18 np0005591285 irqbalance[782]: IRQ 31 affinity is now unmanaged
Jan 21 17:44:18 np0005591285 irqbalance[782]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 21 17:44:18 np0005591285 irqbalance[782]: IRQ 28 affinity is now unmanaged
Jan 21 17:44:18 np0005591285 irqbalance[782]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 21 17:44:18 np0005591285 irqbalance[782]: IRQ 32 affinity is now unmanaged
Jan 21 17:44:18 np0005591285 irqbalance[782]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 21 17:44:18 np0005591285 irqbalance[782]: IRQ 30 affinity is now unmanaged
Jan 21 17:44:18 np0005591285 irqbalance[782]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 21 17:44:18 np0005591285 irqbalance[782]: IRQ 29 affinity is now unmanaged
Jan 21 17:44:18 np0005591285 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:44:19 np0005591285 dracut[1281]: *** Installing kernel module dependencies done ***
Jan 21 17:44:19 np0005591285 dracut[1281]: *** Resolving executable dependencies ***
Jan 21 17:44:20 np0005591285 dracut[1281]: *** Resolving executable dependencies done ***
Jan 21 17:44:20 np0005591285 dracut[1281]: *** Generating early-microcode cpio image ***
Jan 21 17:44:20 np0005591285 dracut[1281]: *** Store current command line parameters ***
Jan 21 17:44:20 np0005591285 dracut[1281]: Stored kernel commandline:
Jan 21 17:44:20 np0005591285 dracut[1281]: No dracut internal kernel commandline stored in the initramfs
Jan 21 17:44:21 np0005591285 dracut[1281]: *** Install squash loader ***
Jan 21 17:44:22 np0005591285 dracut[1281]: *** Squashing the files inside the initramfs ***
Jan 21 17:44:23 np0005591285 dracut[1281]: *** Squashing the files inside the initramfs done ***
Jan 21 17:44:23 np0005591285 dracut[1281]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 21 17:44:23 np0005591285 dracut[1281]: *** Hardlinking files ***
Jan 21 17:44:23 np0005591285 dracut[1281]: *** Hardlinking files done ***
Jan 21 17:44:23 np0005591285 dracut[1281]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 21 17:44:24 np0005591285 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Jan 21 17:44:24 np0005591285 kdumpctl[1018]: kdump: Starting kdump: [OK]
Jan 21 17:44:24 np0005591285 systemd[1]: Finished Crash recovery kernel arming.
Jan 21 17:44:24 np0005591285 systemd[1]: Startup finished in 1.736s (kernel) + 2.782s (initrd) + 18.450s (userspace) = 22.968s.
Jan 21 17:44:38 np0005591285 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:45:19 np0005591285 chronyd[793]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Jan 21 17:45:23 np0005591285 systemd[1]: Created slice User Slice of UID 1000.
Jan 21 17:45:23 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 21 17:45:23 np0005591285 systemd-logind[788]: New session 1 of user zuul.
Jan 21 17:45:23 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 21 17:45:23 np0005591285 systemd[1]: Starting User Manager for UID 1000...
Jan 21 17:45:24 np0005591285 systemd[4308]: Queued start job for default target Main User Target.
Jan 21 17:45:24 np0005591285 systemd[4308]: Created slice User Application Slice.
Jan 21 17:45:24 np0005591285 systemd[4308]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 17:45:24 np0005591285 systemd[4308]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 17:45:24 np0005591285 systemd[4308]: Reached target Paths.
Jan 21 17:45:24 np0005591285 systemd[4308]: Reached target Timers.
Jan 21 17:45:24 np0005591285 systemd[4308]: Starting D-Bus User Message Bus Socket...
Jan 21 17:45:24 np0005591285 systemd[4308]: Starting Create User's Volatile Files and Directories...
Jan 21 17:45:24 np0005591285 systemd[4308]: Finished Create User's Volatile Files and Directories.
Jan 21 17:45:24 np0005591285 systemd[4308]: Listening on D-Bus User Message Bus Socket.
Jan 21 17:45:24 np0005591285 systemd[4308]: Reached target Sockets.
Jan 21 17:45:24 np0005591285 systemd[4308]: Reached target Basic System.
Jan 21 17:45:24 np0005591285 systemd[4308]: Reached target Main User Target.
Jan 21 17:45:24 np0005591285 systemd[4308]: Startup finished in 127ms.
Jan 21 17:45:24 np0005591285 systemd[1]: Started User Manager for UID 1000.
Jan 21 17:45:24 np0005591285 systemd[1]: Started Session 1 of User zuul.
Jan 21 17:45:24 np0005591285 python3[4390]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:45:35 np0005591285 python3[4418]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:45:42 np0005591285 python3[4476]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:45:43 np0005591285 python3[4516]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 21 17:45:45 np0005591285 python3[4542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2e4yttZDYBqdG8LApHzgUrKnJJhokPjy46000EGKrecg+C8A4mLQflJ0D/xvugtt/H91C3VfRJbQOPQ7hZmStaqICNoXl/C8gc+eNroWZE+yY/wlWIxUH08XS6asYrTpDpg5UmpvUaYUK+3UMHnBY7Ito24+Jty+rd2YwCphABstuMfb1NJAx6Jml5CgCMob2n9WNcySPRTJ7JEA45egnysW3zGHGsS6qA8z8KP4tsp0oqBu1cfczB2RxnOXPhXZSJcS+3lww8bkb/wmQh1+Ho5qQEILiO5sxZGE4T9giN9XH2aveWWK0ttofy63F0tFxrl4uVBOtPYvY+GFt+GJuAwQK/wFmObp8yFqj8YU0HrxwXaVGLO6bfltMq8+k+/sDcwLSVGsCR6kw70L44MXX4znyZuRO7aEx+rAOMmL9ZfrVMgF7BEKlJG7ZldriZuFA1dpyF07UOpUN5wDaKC0EUC9s9ANBhs/JzmSBbA66LTl3G+2zXPfjQLBU99msPhs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:45:45 np0005591285 python3[4566]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:46 np0005591285 python3[4665]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:45:46 np0005591285 python3[4736]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035545.956047-254-16405146410492/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=3005863ccc544e3a9d90dbd38e9aa500_id_rsa follow=False checksum=232cfc4771d49d01feffe7bca174ec959890bb55 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:47 np0005591285 python3[4859]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:45:47 np0005591285 python3[4930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035547.040748-309-21341080736055/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=3005863ccc544e3a9d90dbd38e9aa500_id_rsa.pub follow=False checksum=0a660c0f8e508883780892e7228376ef7bc415eb backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:49 np0005591285 python3[4978]: ansible-ping Invoked with data=pong
Jan 21 17:45:50 np0005591285 python3[5002]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 17:45:52 np0005591285 python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 21 17:45:53 np0005591285 python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:54 np0005591285 python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:54 np0005591285 python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:54 np0005591285 python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:55 np0005591285 python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:55 np0005591285 python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:56 np0005591285 python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:57 np0005591285 python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:45:58 np0005591285 python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035557.0901866-34-230431475702739/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:45:58 np0005591285 python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:45:59 np0005591285 python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:45:59 np0005591285 python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:45:59 np0005591285 python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:00 np0005591285 python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:00 np0005591285 python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:00 np0005591285 python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:00 np0005591285 python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:01 np0005591285 python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:01 np0005591285 python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:01 np0005591285 python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:02 np0005591285 python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:02 np0005591285 python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:02 np0005591285 python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:03 np0005591285 python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:03 np0005591285 python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:03 np0005591285 python3[5821]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:04 np0005591285 python3[5845]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:04 np0005591285 python3[5869]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:04 np0005591285 python3[5893]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:04 np0005591285 python3[5917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:05 np0005591285 python3[5941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:05 np0005591285 python3[5965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:05 np0005591285 python3[5989]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:06 np0005591285 python3[6013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:06 np0005591285 python3[6037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:46:08 np0005591285 python3[6063]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 17:46:08 np0005591285 systemd[1]: Starting Time & Date Service...
Jan 21 17:46:08 np0005591285 systemd[1]: Started Time & Date Service.
Jan 21 17:46:08 np0005591285 systemd-timedated[6065]: Changed time zone to 'UTC' (UTC).
Jan 21 17:46:08 np0005591285 python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:09 np0005591285 python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:46:09 np0005591285 python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769035569.2339036-254-216210992311555/source _original_basename=tmp6vlx68o_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:10 np0005591285 python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:46:10 np0005591285 python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769035570.1105242-304-10452385227079/source _original_basename=tmp5ww5wmcv follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:11 np0005591285 python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:46:12 np0005591285 python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769035571.4559922-384-32071217975883/source _original_basename=tmpcqs1hwy4 follow=False checksum=fdc491946312142db92bc3cff6285a0a5b207d8c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:12 np0005591285 python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:46:13 np0005591285 python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:46:13 np0005591285 python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:46:14 np0005591285 python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035573.2606375-454-244653217262418/source _original_basename=tmpq8fsmm3u follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:14 np0005591285 python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-2a6d-faa9-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:46:15 np0005591285 python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-2a6d-faa9-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 21 17:46:16 np0005591285 python3[6921]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:38 np0005591285 python3[6947]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:46:38 np0005591285 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 17:47:28 np0005591285 chronyd[793]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 21 17:47:38 np0005591285 systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 21 17:48:00 np0005591285 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 21 17:48:00 np0005591285 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 21 17:48:00 np0005591285 NetworkManager[860]: <info>  [1769035680.9818] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 17:48:00 np0005591285 systemd-udevd[6950]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0012] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0039] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0042] device (eth1): carrier: link connected
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0044] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0050] policy: auto-activating connection 'Wired connection 1' (31dfefde-0caa-33b1-9dbb-97e36adad912)
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0054] device (eth1): Activation: starting connection 'Wired connection 1' (31dfefde-0caa-33b1-9dbb-97e36adad912)
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0055] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0057] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0061] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:48:01 np0005591285 NetworkManager[860]: <info>  [1769035681.0066] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:01 np0005591285 systemd[4308]: Starting Mark boot as successful...
Jan 21 17:48:01 np0005591285 systemd[4308]: Finished Mark boot as successful.
Jan 21 17:48:01 np0005591285 systemd-logind[788]: New session 3 of user zuul.
Jan 21 17:48:01 np0005591285 systemd[1]: Started Session 3 of User zuul.
Jan 21 17:48:02 np0005591285 python3[6982]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-4168-cfbf-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:48:09 np0005591285 python3[7062]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:48:09 np0005591285 python3[7135]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035688.7232509-206-149281487683588/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=5a854425b068c1dd101a8bf88edf9dca519e8cd1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:48:10 np0005591285 python3[7185]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 17:48:10 np0005591285 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 17:48:10 np0005591285 systemd[1]: Stopped Network Manager Wait Online.
Jan 21 17:48:10 np0005591285 systemd[1]: Stopping Network Manager Wait Online...
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1792] caught SIGTERM, shutting down normally.
Jan 21 17:48:10 np0005591285 systemd[1]: Stopping Network Manager...
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1809] dhcp4 (eth0): canceled DHCP transaction
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1809] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1809] dhcp4 (eth0): state changed no lease
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1813] manager: NetworkManager state is now CONNECTING
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1879] dhcp4 (eth1): canceled DHCP transaction
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1880] dhcp4 (eth1): state changed no lease
Jan 21 17:48:10 np0005591285 NetworkManager[860]: <info>  [1769035690.1984] exiting (success)
Jan 21 17:48:10 np0005591285 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:48:10 np0005591285 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 17:48:10 np0005591285 systemd[1]: Stopped Network Manager.
Jan 21 17:48:10 np0005591285 systemd[1]: NetworkManager.service: Consumed 1.901s CPU time, 10.2M memory peak.
Jan 21 17:48:10 np0005591285 systemd[1]: Starting Network Manager...
Jan 21 17:48:10 np0005591285 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.2698] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:2993e076-ac8d-4723-86a0-913496004632)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.2701] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.2768] manager[0x560a7559a000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 17:48:10 np0005591285 systemd[1]: Starting Hostname Service...
Jan 21 17:48:10 np0005591285 systemd[1]: Started Hostname Service.
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3830] hostname: hostname: using hostnamed
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3831] hostname: static hostname changed from (none) to "np0005591285.novalocal"
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3839] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3848] manager[0x560a7559a000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3848] manager[0x560a7559a000]: rfkill: WWAN hardware radio set enabled
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3896] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3897] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3898] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3899] manager: Networking is enabled by state file
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3904] settings: Loaded settings plugin: keyfile (internal)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3911] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3953] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3971] dhcp: init: Using DHCP client 'internal'
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3977] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3987] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.3999] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4017] device (lo): Activation: starting connection 'lo' (e8b88d7b-c546-4855-a53a-f2271e918cb0)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4030] device (eth0): carrier: link connected
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4037] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4047] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4048] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4060] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4072] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4083] device (eth1): carrier: link connected
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4090] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4101] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (31dfefde-0caa-33b1-9dbb-97e36adad912) (indicated)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4102] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4111] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4123] device (eth1): Activation: starting connection 'Wired connection 1' (31dfefde-0caa-33b1-9dbb-97e36adad912)
Jan 21 17:48:10 np0005591285 systemd[1]: Started Network Manager.
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4136] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4143] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4147] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4151] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4155] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4161] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4164] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4168] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4175] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4186] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4191] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4204] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4209] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4238] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4241] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4250] device (lo): Activation: successful, device activated.
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4262] dhcp4 (eth0): state changed new lease, address=38.102.83.145
Jan 21 17:48:10 np0005591285 systemd[1]: Starting Network Manager Wait Online...
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4274] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4368] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4398] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4401] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4407] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4411] device (eth0): Activation: successful, device activated.
Jan 21 17:48:10 np0005591285 NetworkManager[7190]: <info>  [1769035690.4420] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 17:48:10 np0005591285 python3[7269]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-4168-cfbf-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:48:20 np0005591285 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:48:40 np0005591285 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.2821] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 17:48:55 np0005591285 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:48:55 np0005591285 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3176] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3181] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3190] device (eth1): Activation: successful, device activated.
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3201] manager: startup complete
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3203] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <warn>  [1769035735.3215] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3231] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 systemd[1]: Finished Network Manager Wait Online.
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3390] dhcp4 (eth1): canceled DHCP transaction
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3390] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3390] dhcp4 (eth1): state changed no lease
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3405] policy: auto-activating connection 'ci-private-network' (6edea1a0-705c-5cc0-8116-93b791d6dfff)
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3409] device (eth1): Activation: starting connection 'ci-private-network' (6edea1a0-705c-5cc0-8116-93b791d6dfff)
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3410] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3413] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3420] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3428] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3471] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3473] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 17:48:55 np0005591285 NetworkManager[7190]: <info>  [1769035735.3476] device (eth1): Activation: successful, device activated.
Jan 21 17:49:05 np0005591285 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:49:10 np0005591285 systemd[1]: session-3.scope: Deactivated successfully.
Jan 21 17:49:10 np0005591285 systemd[1]: session-3.scope: Consumed 1.884s CPU time.
Jan 21 17:49:10 np0005591285 systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Jan 21 17:49:10 np0005591285 systemd-logind[788]: Removed session 3.
Jan 21 17:49:15 np0005591285 systemd-logind[788]: New session 4 of user zuul.
Jan 21 17:49:15 np0005591285 systemd[1]: Started Session 4 of User zuul.
Jan 21 17:49:16 np0005591285 python3[7380]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:49:16 np0005591285 python3[7453]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035755.9905612-365-9997556175540/source _original_basename=tmpqn_yj503 follow=False checksum=9be2ac127257c76b31f8acdef7104cc3c2481547 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:49:18 np0005591285 systemd[1]: session-4.scope: Deactivated successfully.
Jan 21 17:49:18 np0005591285 systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Jan 21 17:49:18 np0005591285 systemd-logind[788]: Removed session 4.
Jan 21 17:51:05 np0005591285 systemd[4308]: Created slice User Background Tasks Slice.
Jan 21 17:51:05 np0005591285 systemd[4308]: Starting Cleanup of User's Temporary Files and Directories...
Jan 21 17:51:05 np0005591285 systemd[4308]: Finished Cleanup of User's Temporary Files and Directories.
Jan 21 17:54:45 np0005591285 systemd-logind[788]: New session 5 of user zuul.
Jan 21 17:54:45 np0005591285 systemd[1]: Started Session 5 of User zuul.
Jan 21 17:54:45 np0005591285 python3[7514]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e3-fd82-000000000ca4-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:45 np0005591285 python3[7542]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:46 np0005591285 python3[7569]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:46 np0005591285 python3[7595]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:46 np0005591285 python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:46 np0005591285 python3[7647]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:47 np0005591285 python3[7725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:54:48 np0005591285 python3[7798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769036087.2319655-367-76346376788060/source _original_basename=tmpavxynrc4 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:54:49 np0005591285 python3[7848]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 17:54:49 np0005591285 systemd[1]: Reloading.
Jan 21 17:54:49 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:54:51 np0005591285 python3[7904]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 21 17:54:54 np0005591285 python3[7930]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:55 np0005591285 python3[7958]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:55 np0005591285 python3[7986]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:55 np0005591285 python3[8014]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:56 np0005591285 python3[8041]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e3-fd82-000000000cab-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:54:57 np0005591285 python3[8071]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 21 17:55:00 np0005591285 systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Jan 21 17:55:00 np0005591285 systemd[1]: session-5.scope: Deactivated successfully.
Jan 21 17:55:00 np0005591285 systemd[1]: session-5.scope: Consumed 4.591s CPU time.
Jan 21 17:55:00 np0005591285 systemd-logind[788]: Removed session 5.
Jan 21 17:55:02 np0005591285 systemd-logind[788]: New session 6 of user zuul.
Jan 21 17:55:02 np0005591285 systemd[1]: Started Session 6 of User zuul.
Jan 21 17:55:02 np0005591285 python3[8104]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 21 17:55:09 np0005591285 setsebool[8147]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 21 17:55:09 np0005591285 setsebool[8147]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 21 17:55:20 np0005591285 kernel: SELinux:  Converting 385 SID table entries...
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:55:20 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:55:30 np0005591285 kernel: SELinux:  Converting 388 SID table entries...
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 17:55:30 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 17:55:48 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 17:55:48 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 17:55:48 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 17:55:48 np0005591285 systemd[1]: Reloading.
Jan 21 17:55:48 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 17:55:48 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 17:55:53 np0005591285 python3[12438]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-17c1-e6f1-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 17:55:54 np0005591285 kernel: evm: overlay not supported
Jan 21 17:55:54 np0005591285 systemd[4308]: Starting D-Bus User Message Bus...
Jan 21 17:55:54 np0005591285 dbus-broker-launch[13358]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 21 17:55:54 np0005591285 dbus-broker-launch[13358]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 21 17:55:54 np0005591285 systemd[4308]: Started D-Bus User Message Bus.
Jan 21 17:55:54 np0005591285 dbus-broker-lau[13358]: Ready
Jan 21 17:55:54 np0005591285 systemd[4308]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 17:55:54 np0005591285 systemd[4308]: Created slice Slice /user.
Jan 21 17:55:54 np0005591285 systemd[4308]: podman-13242.scope: unit configures an IP firewall, but not running as root.
Jan 21 17:55:54 np0005591285 systemd[4308]: (This warning is only shown for the first unit using IP firewalling.)
Jan 21 17:55:54 np0005591285 systemd[4308]: Started podman-13242.scope.
Jan 21 17:55:54 np0005591285 systemd[4308]: Started podman-pause-2aaffb3f.scope.
Jan 21 17:55:55 np0005591285 python3[13932]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.27:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.27:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:55:55 np0005591285 python3[13932]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 21 17:55:56 np0005591285 systemd[1]: session-6.scope: Deactivated successfully.
Jan 21 17:55:56 np0005591285 systemd[1]: session-6.scope: Consumed 43.210s CPU time.
Jan 21 17:55:56 np0005591285 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Jan 21 17:55:56 np0005591285 systemd-logind[788]: Removed session 6.
Jan 21 17:56:08 np0005591285 irqbalance[782]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 21 17:56:08 np0005591285 irqbalance[782]: IRQ 27 affinity is now unmanaged
Jan 21 17:56:20 np0005591285 systemd-logind[788]: New session 7 of user zuul.
Jan 21 17:56:20 np0005591285 systemd[1]: Started Session 7 of User zuul.
Jan 21 17:56:20 np0005591285 python3[22327]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:56:20 np0005591285 python3[22487]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:56:21 np0005591285 python3[22844]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005591285.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 21 17:56:22 np0005591285 python3[23107]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 17:56:23 np0005591285 python3[23354]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 17:56:23 np0005591285 python3[23570]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769036182.8605115-170-250613316956239/source _original_basename=tmph0blp2so follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 17:56:24 np0005591285 python3[23883]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 21 17:56:24 np0005591285 systemd[1]: Starting Hostname Service...
Jan 21 17:56:24 np0005591285 systemd[1]: Started Hostname Service.
Jan 21 17:56:24 np0005591285 systemd-hostnamed[23972]: Changed pretty hostname to 'compute-2'
Jan 21 17:56:24 np0005591285 systemd-hostnamed[23972]: Hostname set to <compute-2> (static)
Jan 21 17:56:24 np0005591285 NetworkManager[7190]: <info>  [1769036184.6985] hostname: static hostname changed from "np0005591285.novalocal" to "compute-2"
Jan 21 17:56:24 np0005591285 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 17:56:24 np0005591285 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 17:56:26 np0005591285 systemd[1]: session-7.scope: Deactivated successfully.
Jan 21 17:56:26 np0005591285 systemd[1]: session-7.scope: Consumed 2.623s CPU time.
Jan 21 17:56:26 np0005591285 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Jan 21 17:56:26 np0005591285 systemd-logind[788]: Removed session 7.
Jan 21 17:56:34 np0005591285 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 17:56:42 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 17:56:42 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 17:56:42 np0005591285 systemd[1]: man-db-cache-update.service: Consumed 1min 6.807s CPU time.
Jan 21 17:56:42 np0005591285 systemd[1]: run-r43353530f7e948c4a81a1b1a1ada93c0.service: Deactivated successfully.
Jan 21 17:56:54 np0005591285 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 17:59:05 np0005591285 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 21 17:59:05 np0005591285 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 21 17:59:05 np0005591285 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 21 17:59:05 np0005591285 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 21 18:00:23 np0005591285 systemd-logind[788]: New session 8 of user zuul.
Jan 21 18:00:23 np0005591285 systemd[1]: Started Session 8 of User zuul.
Jan 21 18:00:24 np0005591285 python3[30018]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:00:26 np0005591285 python3[30134]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:26 np0005591285 python3[30207]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:26 np0005591285 python3[30233]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:27 np0005591285 python3[30306]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:27 np0005591285 python3[30332]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:27 np0005591285 python3[30405]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:27 np0005591285 python3[30431]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:28 np0005591285 python3[30504]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:28 np0005591285 python3[30530]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:28 np0005591285 python3[30603]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:29 np0005591285 python3[30629]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:29 np0005591285 python3[30702]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:29 np0005591285 python3[30728]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 18:00:30 np0005591285 python3[30801]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.7082865-34008-83710188034809/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:00:40 np0005591285 python3[30849]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:05:40 np0005591285 systemd[1]: session-8.scope: Deactivated successfully.
Jan 21 18:05:40 np0005591285 systemd[1]: session-8.scope: Consumed 5.216s CPU time.
Jan 21 18:05:40 np0005591285 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Jan 21 18:05:40 np0005591285 systemd-logind[788]: Removed session 8.
Jan 21 18:07:05 np0005591285 systemd[1]: Starting dnf makecache...
Jan 21 18:07:05 np0005591285 dnf[30872]: Failed determining last makecache time.
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-openstack-barbican-42b4c41831408a8e323 252 kB/s |  13 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.6 MB/s |  65 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-python-stevedore-c4acc5639fd2329372142 4.2 MB/s | 131 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.1 MB/s |  32 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-os-refresh-config-9bfc52b5049be2d8de61  10 MB/s | 349 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.8 MB/s |  42 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-python-designate-tests-tempest-347fdbc 785 kB/s |  18 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-openstack-glance-1fd12c29b339f30fe823e 644 kB/s |  18 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 773 kB/s |  29 kB     00:00
Jan 21 18:07:05 np0005591285 dnf[30872]: delorean-openstack-manila-3c01b7181572c95dac462 1.1 MB/s |  25 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-python-whitebox-neutron-tests-tempest- 5.7 MB/s | 154 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-openstack-octavia-ba397f07a7331190208c 733 kB/s |  26 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-openstack-watcher-c014f81a8647287f6dcc 668 kB/s |  16 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-ansible-config_template-5ccaa22121a7ff 105 kB/s | 7.4 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 2.6 MB/s | 144 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-openstack-swift-dc98a8463506ac520c469a 529 kB/s |  14 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-python-tempestconf-8515371b7cceebd4282 2.0 MB/s |  53 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: delorean-openstack-heat-ui-013accbfd179753bc3f0 2.2 MB/s |  96 kB     00:00
Jan 21 18:07:06 np0005591285 dnf[30872]: CentOS Stream 9 - BaseOS                         17 kB/s | 6.7 kB     00:00
Jan 21 18:07:07 np0005591285 dnf[30872]: CentOS Stream 9 - AppStream                      62 kB/s | 6.8 kB     00:00
Jan 21 18:07:07 np0005591285 dnf[30872]: CentOS Stream 9 - CRB                            35 kB/s | 6.6 kB     00:00
Jan 21 18:07:07 np0005591285 dnf[30872]: CentOS Stream 9 - Extras packages                19 kB/s | 7.3 kB     00:00
Jan 21 18:07:07 np0005591285 dnf[30872]: dlrn-antelope-testing                            25 MB/s | 1.1 MB     00:00
Jan 21 18:07:08 np0005591285 dnf[30872]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Jan 21 18:07:08 np0005591285 dnf[30872]: centos9-rabbitmq                                9.6 MB/s | 123 kB     00:00
Jan 21 18:07:08 np0005591285 dnf[30872]: centos9-storage                                  22 MB/s | 415 kB     00:00
Jan 21 18:07:08 np0005591285 dnf[30872]: centos9-opstools                                3.8 MB/s |  51 kB     00:00
Jan 21 18:07:08 np0005591285 dnf[30872]: NFV SIG OpenvSwitch                             8.4 MB/s | 461 kB     00:00
Jan 21 18:07:09 np0005591285 dnf[30872]: repo-setup-centos-appstream                      59 MB/s |  26 MB     00:00
Jan 21 18:07:15 np0005591285 dnf[30872]: repo-setup-centos-baseos                         47 MB/s | 8.9 MB     00:00
Jan 21 18:07:17 np0005591285 dnf[30872]: repo-setup-centos-highavailability              7.9 MB/s | 744 kB     00:00
Jan 21 18:07:17 np0005591285 dnf[30872]: repo-setup-centos-powertools                     52 MB/s | 7.6 MB     00:00
Jan 21 18:07:20 np0005591285 dnf[30872]: Extra Packages for Enterprise Linux 9 - x86_64   16 MB/s |  20 MB     00:01
Jan 21 18:07:37 np0005591285 dnf[30872]: Metadata cache created.
Jan 21 18:07:37 np0005591285 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 21 18:07:37 np0005591285 systemd[1]: Finished dnf makecache.
Jan 21 18:07:37 np0005591285 systemd[1]: dnf-makecache.service: Consumed 28.359s CPU time.
Jan 21 18:15:34 np0005591285 systemd-logind[788]: New session 9 of user zuul.
Jan 21 18:15:34 np0005591285 systemd[1]: Started Session 9 of User zuul.
Jan 21 18:15:35 np0005591285 python3.9[31133]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:15:36 np0005591285 python3.9[31314]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:15:51 np0005591285 systemd[1]: session-9.scope: Deactivated successfully.
Jan 21 18:15:51 np0005591285 systemd[1]: session-9.scope: Consumed 7.965s CPU time.
Jan 21 18:15:51 np0005591285 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Jan 21 18:15:51 np0005591285 systemd-logind[788]: Removed session 9.
Jan 21 18:16:07 np0005591285 systemd-logind[788]: New session 10 of user zuul.
Jan 21 18:16:07 np0005591285 systemd[1]: Started Session 10 of User zuul.
Jan 21 18:16:08 np0005591285 python3.9[31526]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 21 18:16:09 np0005591285 python3.9[31700]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:16:10 np0005591285 python3.9[31852]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:16:11 np0005591285 python3.9[32005]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:16:12 np0005591285 python3.9[32157]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:16:13 np0005591285 python3.9[32309]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:16:14 np0005591285 python3.9[32432]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037372.781112-180-113186896330631/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:16:14 np0005591285 python3.9[32584]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:16:16 np0005591285 python3.9[32740]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:16:16 np0005591285 python3.9[32892]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:16:17 np0005591285 python3.9[33042]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:16:21 np0005591285 python3.9[33295]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:16:22 np0005591285 python3.9[33445]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:16:24 np0005591285 python3.9[33599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:16:25 np0005591285 python3.9[33757]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:16:26 np0005591285 python3.9[33842]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:16:28 np0005591285 irqbalance[782]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 21 18:16:28 np0005591285 irqbalance[782]: IRQ 26 affinity is now unmanaged
Jan 21 18:17:28 np0005591285 systemd[1]: Reloading.
Jan 21 18:17:28 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:17:28 np0005591285 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 21 18:17:29 np0005591285 systemd[1]: Reloading.
Jan 21 18:17:29 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:17:29 np0005591285 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 21 18:17:30 np0005591285 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 21 18:17:30 np0005591285 systemd[1]: Reloading.
Jan 21 18:17:30 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:17:30 np0005591285 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 21 18:17:30 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:17:30 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:17:30 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:18:43 np0005591285 kernel: SELinux:  Converting 2725 SID table entries...
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 18:18:43 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 18:18:44 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 21 18:18:44 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:18:44 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:18:44 np0005591285 systemd[1]: Reloading.
Jan 21 18:18:44 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:18:44 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:18:45 np0005591285 python3.9[35355]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:18:47 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:18:47 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:18:47 np0005591285 systemd[1]: man-db-cache-update.service: Consumed 1.125s CPU time.
Jan 21 18:18:47 np0005591285 systemd[1]: run-r62c4e9d950d8407cb474c90c40d22520.service: Deactivated successfully.
Jan 21 18:18:47 np0005591285 python3.9[35636]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 21 18:18:48 np0005591285 python3.9[35790]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 21 18:18:52 np0005591285 python3.9[35943]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:18:53 np0005591285 python3.9[36095]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 21 18:18:55 np0005591285 python3.9[36247]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:19:02 np0005591285 python3.9[36399]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:19:03 np0005591285 python3.9[36522]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037538.7460365-669-83765051420957/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:19:05 np0005591285 python3.9[36674]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:19:06 np0005591285 python3.9[36826]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:07 np0005591285 python3.9[36979]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:19:08 np0005591285 python3.9[37131]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 21 18:19:08 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:19:08 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:19:09 np0005591285 python3.9[37285]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:19:10 np0005591285 python3.9[37443]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:19:12 np0005591285 python3.9[37603]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 21 18:19:12 np0005591285 python3.9[37756]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:19:13 np0005591285 python3.9[37914]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 21 18:19:15 np0005591285 python3.9[38066]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:19:18 np0005591285 python3.9[38219]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:19:19 np0005591285 python3.9[38371]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:19:19 np0005591285 python3.9[38494]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037558.5036945-1026-147369525654407/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:19:20 np0005591285 python3.9[38646]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:19:21 np0005591285 systemd[1]: Starting Load Kernel Modules...
Jan 21 18:19:21 np0005591285 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 21 18:19:21 np0005591285 kernel: Bridge firewalling registered
Jan 21 18:19:21 np0005591285 systemd-modules-load[38650]: Inserted module 'br_netfilter'
Jan 21 18:19:21 np0005591285 systemd[1]: Finished Load Kernel Modules.
Jan 21 18:19:22 np0005591285 python3.9[38806]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:19:23 np0005591285 python3.9[38929]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037562.3594177-1095-145523637498104/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:19:24 np0005591285 python3.9[39081]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:19:28 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:19:28 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:19:28 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:19:28 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:19:28 np0005591285 systemd[1]: Reloading.
Jan 21 18:19:28 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:19:28 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:19:30 np0005591285 python3.9[40979]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:19:31 np0005591285 python3.9[41968]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 21 18:19:31 np0005591285 python3.9[42770]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:19:32 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:19:32 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:19:32 np0005591285 systemd[1]: man-db-cache-update.service: Consumed 4.687s CPU time.
Jan 21 18:19:32 np0005591285 systemd[1]: run-rf34924ef3cd048d193b6cdbf324c473d.service: Deactivated successfully.
Jan 21 18:19:32 np0005591285 python3.9[43250]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:33 np0005591285 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 18:19:33 np0005591285 systemd[1]: Starting Authorization Manager...
Jan 21 18:19:33 np0005591285 polkitd[43467]: Started polkitd version 0.117
Jan 21 18:19:33 np0005591285 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 18:19:33 np0005591285 systemd[1]: Started Authorization Manager.
Jan 21 18:19:34 np0005591285 python3.9[43637]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:19:34 np0005591285 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 21 18:19:34 np0005591285 systemd[1]: tuned.service: Deactivated successfully.
Jan 21 18:19:34 np0005591285 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 21 18:19:34 np0005591285 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 18:19:34 np0005591285 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 18:19:35 np0005591285 python3.9[43799]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 21 18:19:39 np0005591285 python3.9[43951]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:19:39 np0005591285 systemd[1]: Reloading.
Jan 21 18:19:39 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:19:40 np0005591285 python3.9[44140]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:19:40 np0005591285 systemd[1]: Reloading.
Jan 21 18:19:40 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:19:41 np0005591285 python3.9[44330]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:42 np0005591285 python3.9[44483]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:42 np0005591285 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 21 18:19:43 np0005591285 python3.9[44636]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:45 np0005591285 python3.9[44798]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:46 np0005591285 python3.9[44951]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:19:46 np0005591285 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 18:19:46 np0005591285 systemd[1]: Stopped Apply Kernel Variables.
Jan 21 18:19:46 np0005591285 systemd[1]: Stopping Apply Kernel Variables...
Jan 21 18:19:46 np0005591285 systemd[1]: Starting Apply Kernel Variables...
Jan 21 18:19:46 np0005591285 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 18:19:46 np0005591285 systemd[1]: Finished Apply Kernel Variables.
Jan 21 18:19:47 np0005591285 systemd[1]: session-10.scope: Deactivated successfully.
Jan 21 18:19:47 np0005591285 systemd[1]: session-10.scope: Consumed 2min 21.046s CPU time.
Jan 21 18:19:47 np0005591285 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Jan 21 18:19:47 np0005591285 systemd-logind[788]: Removed session 10.
Jan 21 18:19:52 np0005591285 systemd-logind[788]: New session 11 of user zuul.
Jan 21 18:19:52 np0005591285 systemd[1]: Started Session 11 of User zuul.
Jan 21 18:19:53 np0005591285 python3.9[45135]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:19:55 np0005591285 python3.9[45289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:19:56 np0005591285 python3.9[45445]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:19:57 np0005591285 python3.9[45596]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:19:59 np0005591285 python3.9[45752]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:19:59 np0005591285 python3.9[45836]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:20:02 np0005591285 python3.9[45989]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:20:03 np0005591285 python3.9[46160]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:20:04 np0005591285 python3.9[46312]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:20:04 np0005591285 podman[46313]: 2026-01-21 23:20:04.266931836 +0000 UTC m=+0.055582841 system refresh
Jan 21 18:20:05 np0005591285 python3.9[46475]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:20:05 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:20:05 np0005591285 python3.9[46598]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037604.5150054-289-94761938998893/.source.json follow=False _original_basename=podman_network_config.j2 checksum=e62bf95263f0eac7e51181d59e96eb6207a0dc5a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:20:06 np0005591285 python3.9[46750]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:20:07 np0005591285 python3.9[46873]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037606.0555286-334-230978441968399/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51f7dfe021bf6a784cb4010cf142a3df219fb1a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:20:08 np0005591285 python3.9[47025]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:20:08 np0005591285 python3.9[47177]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:20:09 np0005591285 python3.9[47329]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:20:10 np0005591285 python3.9[47481]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:20:11 np0005591285 python3.9[47631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:20:12 np0005591285 python3.9[47785]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:14 np0005591285 python3.9[47938]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:17 np0005591285 python3.9[48098]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:20 np0005591285 python3.9[48251]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:22 np0005591285 python3.9[48404]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:24 np0005591285 python3.9[48560]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:28 np0005591285 python3.9[48730]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:30 np0005591285 python3.9[48883]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:51 np0005591285 python3.9[49219]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:53 np0005591285 python3.9[49376]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:20:56 np0005591285 python3.9[49533]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:20:57 np0005591285 python3.9[49708]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:20:57 np0005591285 python3.9[49831]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769037656.8454437-808-77727892305809/.source.json _original_basename=.uahuu_7d follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:20:59 np0005591285 python3.9[49983]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 18:20:59 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:01 np0005591285 systemd[1]: var-lib-containers-storage-overlay-compat1169905932-lower\x2dmapped.mount: Deactivated successfully.
Jan 21 18:21:06 np0005591285 podman[49994]: 2026-01-21 23:21:06.712587103 +0000 UTC m=+7.457871534 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 18:21:06 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:06 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:06 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:09 np0005591285 python3.9[50291]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 18:21:09 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:26 np0005591285 podman[50304]: 2026-01-21 23:21:26.989366962 +0000 UTC m=+17.408393738 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 18:21:26 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:27 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:27 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:28 np0005591285 python3.9[50587]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 18:21:28 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:33 np0005591285 podman[50599]: 2026-01-21 23:21:33.15498314 +0000 UTC m=+4.230263458 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 18:21:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:34 np0005591285 python3.9[50855]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 18:21:34 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:35 np0005591285 podman[50867]: 2026-01-21 23:21:35.341952182 +0000 UTC m=+1.121117308 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 18:21:35 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:35 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:35 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:21:36 np0005591285 systemd[1]: session-11.scope: Deactivated successfully.
Jan 21 18:21:36 np0005591285 systemd[1]: session-11.scope: Consumed 1min 46.214s CPU time.
Jan 21 18:21:36 np0005591285 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Jan 21 18:21:36 np0005591285 systemd-logind[788]: Removed session 11.
Jan 21 18:21:46 np0005591285 systemd-logind[788]: New session 12 of user zuul.
Jan 21 18:21:46 np0005591285 systemd[1]: Started Session 12 of User zuul.
Jan 21 18:21:47 np0005591285 python3.9[51168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:21:49 np0005591285 python3.9[51324]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 21 18:21:50 np0005591285 python3.9[51477]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:21:52 np0005591285 python3.9[51635]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:21:53 np0005591285 python3.9[51795]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:21:54 np0005591285 python3.9[51879]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:21:58 np0005591285 python3.9[52045]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:22:11 np0005591285 kernel: SELinux:  Converting 2738 SID table entries...
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 18:22:11 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 18:22:12 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 21 18:22:12 np0005591285 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 21 18:22:13 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:22:13 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:22:13 np0005591285 systemd[1]: Reloading.
Jan 21 18:22:13 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:22:13 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:22:13 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:22:14 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:22:14 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:22:14 np0005591285 systemd[1]: run-r0b6b0db8dd924fcd951025a4ab799ccc.service: Deactivated successfully.
Jan 21 18:22:15 np0005591285 python3.9[53142]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:22:15 np0005591285 systemd[1]: Reloading.
Jan 21 18:22:15 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:22:15 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:22:16 np0005591285 systemd[1]: Starting Open vSwitch Database Unit...
Jan 21 18:22:16 np0005591285 chown[53184]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 21 18:22:16 np0005591285 ovs-ctl[53189]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 21 18:22:16 np0005591285 ovs-ctl[53189]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 21 18:22:16 np0005591285 ovs-ctl[53189]: Starting ovsdb-server [  OK  ]
Jan 21 18:22:16 np0005591285 ovs-vsctl[53238]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 21 18:22:16 np0005591285 ovs-vsctl[53254]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ce4b296c-26ac-415a-aa87-9634754eb3d3\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 21 18:22:16 np0005591285 ovs-ctl[53189]: Configuring Open vSwitch system IDs [  OK  ]
Jan 21 18:22:16 np0005591285 ovs-vsctl[53264]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 21 18:22:16 np0005591285 ovs-ctl[53189]: Enabling remote OVSDB managers [  OK  ]
Jan 21 18:22:16 np0005591285 systemd[1]: Started Open vSwitch Database Unit.
Jan 21 18:22:16 np0005591285 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 21 18:22:16 np0005591285 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 21 18:22:16 np0005591285 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 21 18:22:16 np0005591285 kernel: openvswitch: Open vSwitch switching datapath
Jan 21 18:22:16 np0005591285 ovs-ctl[53309]: Inserting openvswitch module [  OK  ]
Jan 21 18:22:16 np0005591285 ovs-ctl[53278]: Starting ovs-vswitchd [  OK  ]
Jan 21 18:22:16 np0005591285 ovs-vsctl[53326]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 21 18:22:16 np0005591285 ovs-ctl[53278]: Enabling remote OVSDB managers [  OK  ]
Jan 21 18:22:16 np0005591285 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 21 18:22:16 np0005591285 systemd[1]: Starting Open vSwitch...
Jan 21 18:22:16 np0005591285 systemd[1]: Finished Open vSwitch.
Jan 21 18:22:17 np0005591285 python3.9[53478]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:22:19 np0005591285 python3.9[53630]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 21 18:22:20 np0005591285 kernel: SELinux:  Converting 2752 SID table entries...
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 18:22:20 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 18:22:21 np0005591285 python3.9[53785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:22:22 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 21 18:22:22 np0005591285 python3.9[53943]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:22:25 np0005591285 python3.9[54096]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:22:26 np0005591285 python3.9[54383]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 21 18:22:28 np0005591285 python3.9[54533]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:22:28 np0005591285 python3.9[54687]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:22:30 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:22:30 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:22:30 np0005591285 systemd[1]: Reloading.
Jan 21 18:22:30 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:22:30 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:22:30 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:22:31 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:22:31 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:22:31 np0005591285 systemd[1]: run-r2f04be61845444b886377d959c1eb0f9.service: Deactivated successfully.
Jan 21 18:22:32 np0005591285 python3.9[55004]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:22:32 np0005591285 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 18:22:32 np0005591285 systemd[1]: Stopped Network Manager Wait Online.
Jan 21 18:22:32 np0005591285 systemd[1]: Stopping Network Manager Wait Online...
Jan 21 18:22:32 np0005591285 systemd[1]: Stopping Network Manager...
Jan 21 18:22:32 np0005591285 NetworkManager[7190]: <info>  [1769037752.3411] caught SIGTERM, shutting down normally.
Jan 21 18:22:32 np0005591285 NetworkManager[7190]: <info>  [1769037752.3439] dhcp4 (eth0): canceled DHCP transaction
Jan 21 18:22:32 np0005591285 NetworkManager[7190]: <info>  [1769037752.3439] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 18:22:32 np0005591285 NetworkManager[7190]: <info>  [1769037752.3439] dhcp4 (eth0): state changed no lease
Jan 21 18:22:32 np0005591285 NetworkManager[7190]: <info>  [1769037752.3447] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 18:22:32 np0005591285 NetworkManager[7190]: <info>  [1769037752.3540] exiting (success)
Jan 21 18:22:32 np0005591285 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 18:22:32 np0005591285 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 18:22:32 np0005591285 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 18:22:32 np0005591285 systemd[1]: Stopped Network Manager.
Jan 21 18:22:32 np0005591285 systemd[1]: NetworkManager.service: Consumed 14.937s CPU time, 4.1M memory peak, read 0B from disk, written 28.5K to disk.
Jan 21 18:22:32 np0005591285 systemd[1]: Starting Network Manager...
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.4342] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:2993e076-ac8d-4723-86a0-913496004632)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.4344] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.4415] manager[0x55d6466b6000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 18:22:32 np0005591285 systemd[1]: Starting Hostname Service...
Jan 21 18:22:32 np0005591285 systemd[1]: Started Hostname Service.
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5429] hostname: hostname: using hostnamed
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5430] hostname: static hostname changed from (none) to "compute-2"
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5435] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5440] manager[0x55d6466b6000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5441] manager[0x55d6466b6000]: rfkill: WWAN hardware radio set enabled
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5463] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5473] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5474] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5474] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5475] manager: Networking is enabled by state file
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5477] settings: Loaded settings plugin: keyfile (internal)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5481] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5512] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5521] dhcp: init: Using DHCP client 'internal'
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5524] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5531] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5536] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5544] device (lo): Activation: starting connection 'lo' (e8b88d7b-c546-4855-a53a-f2271e918cb0)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5551] device (eth0): carrier: link connected
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5555] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5561] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5561] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5567] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5574] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5580] device (eth1): carrier: link connected
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5584] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5590] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6edea1a0-705c-5cc0-8116-93b791d6dfff) (indicated)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5591] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5596] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5602] device (eth1): Activation: starting connection 'ci-private-network' (6edea1a0-705c-5cc0-8116-93b791d6dfff)
Jan 21 18:22:32 np0005591285 systemd[1]: Started Network Manager.
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5612] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5621] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5623] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5625] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5628] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5632] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5634] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5637] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5642] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5652] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5656] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5666] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5683] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5694] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5696] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5703] device (lo): Activation: successful, device activated.
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5714] dhcp4 (eth0): state changed new lease, address=38.102.83.145
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5724] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 18:22:32 np0005591285 systemd[1]: Starting Network Manager Wait Online...
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5827] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5833] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5839] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5843] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5845] device (eth1): Activation: successful, device activated.
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5854] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5856] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5859] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5862] device (eth0): Activation: successful, device activated.
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5866] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 18:22:32 np0005591285 NetworkManager[55017]: <info>  [1769037752.5871] manager: startup complete
Jan 21 18:22:32 np0005591285 systemd[1]: Finished Network Manager Wait Online.
Jan 21 18:22:33 np0005591285 python3.9[55231]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:22:38 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:22:38 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:22:38 np0005591285 systemd[1]: Reloading.
Jan 21 18:22:38 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:22:38 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:22:38 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:22:39 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:22:39 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:22:39 np0005591285 systemd[1]: run-r4d1d3783c5224def9b463843cf8400ac.service: Deactivated successfully.
Jan 21 18:22:40 np0005591285 python3.9[55690]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:22:41 np0005591285 python3.9[55842]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:42 np0005591285 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 18:22:43 np0005591285 python3.9[55996]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:43 np0005591285 python3.9[56148]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:44 np0005591285 python3.9[56300]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:45 np0005591285 python3.9[56452]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:46 np0005591285 python3.9[56604]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:22:47 np0005591285 python3.9[56727]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037765.9446201-649-36403273799535/.source _original_basename=.pubqvnw2 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:47 np0005591285 python3.9[56879]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:48 np0005591285 python3.9[57031]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 21 18:22:49 np0005591285 python3.9[57183]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:22:52 np0005591285 python3.9[57610]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 21 18:22:54 np0005591285 ansible-async_wrapper.py[57785]: Invoked with j483861627338 300 /home/zuul/.ansible/tmp/ansible-tmp-1769037773.0128076-847-207728291439014/AnsiballZ_edpm_os_net_config.py _
Jan 21 18:22:54 np0005591285 ansible-async_wrapper.py[57788]: Starting module and watcher
Jan 21 18:22:54 np0005591285 ansible-async_wrapper.py[57788]: Start watching 57789 (300)
Jan 21 18:22:54 np0005591285 ansible-async_wrapper.py[57789]: Start module (57789)
Jan 21 18:22:54 np0005591285 ansible-async_wrapper.py[57785]: Return async_wrapper task started.
Jan 21 18:22:55 np0005591285 python3.9[57790]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 21 18:22:55 np0005591285 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 21 18:22:55 np0005591285 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 21 18:22:55 np0005591285 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 21 18:22:55 np0005591285 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 21 18:22:55 np0005591285 kernel: cfg80211: failed to load regulatory.db
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.7568] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.7586] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8057] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8058] audit: op="connection-add" uuid="e00990f3-55ed-4294-b6cc-1081fcdaffc3" name="br-ex-br" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8073] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8074] audit: op="connection-add" uuid="d490ee6f-f602-49cc-bf03-da2f74f263f7" name="br-ex-port" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8085] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8086] audit: op="connection-add" uuid="057f8584-93e6-4119-830d-5c41aca216c6" name="eth1-port" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8096] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8097] audit: op="connection-add" uuid="3bfaf0de-2fcd-45be-b518-a80bff88efe9" name="vlan20-port" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8106] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8107] audit: op="connection-add" uuid="143ed989-2be6-48c0-bcfc-9e519f81adb3" name="vlan21-port" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8116] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8117] audit: op="connection-add" uuid="d7141d7c-aad5-45fd-8000-ed8dc00f90bf" name="vlan22-port" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8135] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu" pid=57791 uid=0 result="success"
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8148] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 21 18:22:56 np0005591285 NetworkManager[55017]: <info>  [1769037776.8150] audit: op="connection-add" uuid="df84279f-5412-4206-9c5e-87b3e1e47cde" name="br-ex-if" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2080] audit: op="connection-update" uuid="6edea1a0-705c-5cc0-8116-93b791d6dfff" name="ci-private-network" args="ovs-external-ids.data,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.method,ipv6.addresses,ipv6.routes,ipv6.routing-rules,ovs-interface.type,connection.controller,connection.master,connection.slave-type,connection.port-type,connection.timestamp" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2117] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2120] audit: op="connection-add" uuid="eaba362b-f800-41a2-9aaf-f6894b925b3d" name="vlan20-if" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2146] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2149] audit: op="connection-add" uuid="00be4a2c-ea24-4eb4-a594-b640567a3672" name="vlan21-if" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2173] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2175] audit: op="connection-add" uuid="e17eca99-7fd7-469f-809b-fc2b62d52f9d" name="vlan22-if" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2190] audit: op="connection-delete" uuid="31dfefde-0caa-33b1-9dbb-97e36adad912" name="Wired connection 1" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2206] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2210] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2218] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2223] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e00990f3-55ed-4294-b6cc-1081fcdaffc3)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2224] audit: op="connection-activate" uuid="e00990f3-55ed-4294-b6cc-1081fcdaffc3" name="br-ex-br" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2227] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2228] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2235] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2240] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (d490ee6f-f602-49cc-bf03-da2f74f263f7)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2242] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2243] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2250] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2255] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (057f8584-93e6-4119-830d-5c41aca216c6)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2258] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2259] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2265] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2271] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (3bfaf0de-2fcd-45be-b518-a80bff88efe9)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2273] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2274] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2281] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2287] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (143ed989-2be6-48c0-bcfc-9e519f81adb3)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2289] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2291] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2297] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2303] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (d7141d7c-aad5-45fd-8000-ed8dc00f90bf)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2304] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2308] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2310] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2319] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2321] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2325] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2330] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (df84279f-5412-4206-9c5e-87b3e1e47cde)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2331] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2335] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2338] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2339] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2341] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2355] device (eth1): disconnecting for new activation request.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2356] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2360] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2362] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2364] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2368] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2369] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2374] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2379] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (eaba362b-f800-41a2-9aaf-f6894b925b3d)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2380] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2384] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2386] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2387] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2389] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2391] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2394] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2399] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (00be4a2c-ea24-4eb4-a594-b640567a3672)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2400] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2403] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2405] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2406] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2410] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <warn>  [1769037777.2411] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2415] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2419] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e17eca99-7fd7-469f-809b-fc2b62d52f9d)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2420] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2424] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2426] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2428] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2431] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2446] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority,802-3-ethernet.mtu" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2447] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2451] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2452] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2459] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2463] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2467] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2471] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2473] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 kernel: ovs-system: entered promiscuous mode
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2496] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2500] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2503] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2504] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 systemd-udevd[57796]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2509] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2512] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2515] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2516] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2521] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 kernel: Timeout policy base is empty
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2533] dhcp4 (eth0): canceled DHCP transaction
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2533] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2533] dhcp4 (eth0): state changed no lease
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2534] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2542] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2544] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57791 uid=0 result="fail" reason="Device is not activated"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2549] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 18:22:57 np0005591285 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2917] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2922] dhcp4 (eth0): state changed new lease, address=38.102.83.145
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.2934] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 kernel: br-ex: entered promiscuous mode
Jan 21 18:22:57 np0005591285 kernel: vlan22: entered promiscuous mode
Jan 21 18:22:57 np0005591285 systemd-udevd[57797]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:22:57 np0005591285 kernel: vlan20: entered promiscuous mode
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3210] device (eth1): disconnecting for new activation request.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3212] audit: op="connection-activate" uuid="6edea1a0-705c-5cc0-8116-93b791d6dfff" name="ci-private-network" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3213] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 21 18:22:57 np0005591285 kernel: vlan21: entered promiscuous mode
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3316] device (eth1): Activation: starting connection 'ci-private-network' (6edea1a0-705c-5cc0-8116-93b791d6dfff)
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3320] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3321] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3323] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3324] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3325] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3326] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3347] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3354] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3360] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3361] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3364] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3368] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3371] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3375] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3378] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3384] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3386] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3390] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3393] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3396] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3398] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3401] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3404] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3412] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3414] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57791 uid=0 result="success"
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3419] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3434] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3449] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3463] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3470] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3477] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3488] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3496] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3497] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3498] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3499] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3502] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3506] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3510] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3513] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3516] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3521] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3522] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3522] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3525] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3529] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 18:22:57 np0005591285 NetworkManager[55017]: <info>  [1769037777.3532] device (eth1): Activation: successful, device activated.
Jan 21 18:22:58 np0005591285 NetworkManager[55017]: <info>  [1769037778.5061] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57791 uid=0 result="success"
Jan 21 18:22:58 np0005591285 python3.9[58123]: ansible-ansible.legacy.async_status Invoked with jid=j483861627338.57785 mode=status _async_dir=/root/.ansible_async
Jan 21 18:22:58 np0005591285 NetworkManager[55017]: <info>  [1769037778.6840] checkpoint[0x55d64668c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 21 18:22:58 np0005591285 NetworkManager[55017]: <info>  [1769037778.6843] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57791 uid=0 result="success"
Jan 21 18:22:58 np0005591285 NetworkManager[55017]: <info>  [1769037778.9764] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57791 uid=0 result="success"
Jan 21 18:22:58 np0005591285 NetworkManager[55017]: <info>  [1769037778.9775] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57791 uid=0 result="success"
Jan 21 18:22:59 np0005591285 NetworkManager[55017]: <info>  [1769037779.1389] audit: op="networking-control" arg="global-dns-configuration" pid=57791 uid=0 result="success"
Jan 21 18:22:59 np0005591285 NetworkManager[55017]: <info>  [1769037779.1416] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 21 18:22:59 np0005591285 NetworkManager[55017]: <info>  [1769037779.1442] audit: op="networking-control" arg="global-dns-configuration" pid=57791 uid=0 result="success"
Jan 21 18:22:59 np0005591285 NetworkManager[55017]: <info>  [1769037779.1462] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57791 uid=0 result="success"
Jan 21 18:22:59 np0005591285 NetworkManager[55017]: <info>  [1769037779.2728] checkpoint[0x55d64668ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 21 18:22:59 np0005591285 NetworkManager[55017]: <info>  [1769037779.2731] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57791 uid=0 result="success"
Jan 21 18:22:59 np0005591285 ansible-async_wrapper.py[57789]: Module complete (57789)
Jan 21 18:22:59 np0005591285 ansible-async_wrapper.py[57788]: Done in kid B.
Jan 21 18:23:02 np0005591285 python3.9[58229]: ansible-ansible.legacy.async_status Invoked with jid=j483861627338.57785 mode=status _async_dir=/root/.ansible_async
Jan 21 18:23:02 np0005591285 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 18:23:02 np0005591285 python3.9[58329]: ansible-ansible.legacy.async_status Invoked with jid=j483861627338.57785 mode=cleanup _async_dir=/root/.ansible_async
Jan 21 18:23:03 np0005591285 python3.9[58483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:23:04 np0005591285 python3.9[58606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037783.0655541-928-272691166103333/.source.returncode _original_basename=.ja0_1nnf follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:23:04 np0005591285 python3.9[58758]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:23:05 np0005591285 python3.9[58882]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037784.4365685-976-92690600666858/.source.cfg _original_basename=.6a8yg7l9 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:23:06 np0005591285 python3.9[59034]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:23:06 np0005591285 systemd[1]: Reloading Network Manager...
Jan 21 18:23:06 np0005591285 NetworkManager[55017]: <info>  [1769037786.3934] audit: op="reload" arg="0" pid=59038 uid=0 result="success"
Jan 21 18:23:06 np0005591285 NetworkManager[55017]: <info>  [1769037786.3942] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 21 18:23:06 np0005591285 systemd[1]: Reloaded Network Manager.
Jan 21 18:23:06 np0005591285 systemd[1]: session-12.scope: Deactivated successfully.
Jan 21 18:23:06 np0005591285 systemd[1]: session-12.scope: Consumed 52.684s CPU time.
Jan 21 18:23:06 np0005591285 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Jan 21 18:23:06 np0005591285 systemd-logind[788]: Removed session 12.
Jan 21 18:23:13 np0005591285 systemd-logind[788]: New session 13 of user zuul.
Jan 21 18:23:13 np0005591285 systemd[1]: Started Session 13 of User zuul.
Jan 21 18:23:14 np0005591285 python3.9[59222]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:23:15 np0005591285 python3.9[59377]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:23:16 np0005591285 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 18:23:17 np0005591285 python3.9[59568]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:23:17 np0005591285 systemd[1]: session-13.scope: Deactivated successfully.
Jan 21 18:23:17 np0005591285 systemd[1]: session-13.scope: Consumed 2.425s CPU time.
Jan 21 18:23:17 np0005591285 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Jan 21 18:23:17 np0005591285 systemd-logind[788]: Removed session 13.
Jan 21 18:23:23 np0005591285 systemd-logind[788]: New session 14 of user zuul.
Jan 21 18:23:23 np0005591285 systemd[1]: Started Session 14 of User zuul.
Jan 21 18:23:24 np0005591285 python3.9[59749]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:23:25 np0005591285 python3.9[59904]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:23:26 np0005591285 python3.9[60060]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:23:27 np0005591285 python3.9[60144]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:23:29 np0005591285 python3.9[60298]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:23:31 np0005591285 python3.9[60489]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:23:32 np0005591285 python3.9[60641]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:23:32 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:23:33 np0005591285 python3.9[60804]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:23:33 np0005591285 python3.9[60882]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:23:34 np0005591285 python3.9[61034]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:23:35 np0005591285 python3.9[61112]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:23:36 np0005591285 python3.9[61264]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:23:36 np0005591285 python3.9[61416]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:23:37 np0005591285 python3.9[61568]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:23:38 np0005591285 python3.9[61720]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:23:39 np0005591285 python3.9[61872]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:23:42 np0005591285 python3.9[62025]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:23:42 np0005591285 python3.9[62179]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:23:43 np0005591285 python3.9[62331]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:23:44 np0005591285 python3.9[62483]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:23:45 np0005591285 python3.9[62636]: ansible-service_facts Invoked
Jan 21 18:23:45 np0005591285 network[62653]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:23:45 np0005591285 network[62654]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:23:45 np0005591285 network[62655]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:23:52 np0005591285 python3.9[63107]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:23:55 np0005591285 python3.9[63260]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 21 18:23:57 np0005591285 python3.9[63412]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:23:58 np0005591285 python3.9[63537]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037836.726457-660-224873735653515/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:23:59 np0005591285 python3.9[63691]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:23:59 np0005591285 python3.9[63816]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037838.625479-705-114472191593875/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:01 np0005591285 python3.9[63970]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:03 np0005591285 python3.9[64124]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:24:04 np0005591285 python3.9[64208]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:24:08 np0005591285 python3.9[64362]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:24:09 np0005591285 python3.9[64446]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:24:09 np0005591285 chronyd[793]: chronyd exiting
Jan 21 18:24:09 np0005591285 systemd[1]: Stopping NTP client/server...
Jan 21 18:24:09 np0005591285 systemd[1]: chronyd.service: Deactivated successfully.
Jan 21 18:24:09 np0005591285 systemd[1]: Stopped NTP client/server.
Jan 21 18:24:09 np0005591285 systemd[1]: Starting NTP client/server...
Jan 21 18:24:09 np0005591285 chronyd[64454]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 18:24:09 np0005591285 chronyd[64454]: Frequency -26.562 +/- 0.164 ppm read from /var/lib/chrony/drift
Jan 21 18:24:09 np0005591285 chronyd[64454]: Loaded seccomp filter (level 2)
Jan 21 18:24:09 np0005591285 systemd[1]: Started NTP client/server.
Jan 21 18:24:10 np0005591285 systemd[1]: session-14.scope: Deactivated successfully.
Jan 21 18:24:10 np0005591285 systemd[1]: session-14.scope: Consumed 27.771s CPU time.
Jan 21 18:24:10 np0005591285 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Jan 21 18:24:10 np0005591285 systemd-logind[788]: Removed session 14.
Jan 21 18:24:15 np0005591285 systemd-logind[788]: New session 15 of user zuul.
Jan 21 18:24:15 np0005591285 systemd[1]: Started Session 15 of User zuul.
Jan 21 18:24:16 np0005591285 python3.9[64633]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:24:17 np0005591285 python3.9[64789]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:18 np0005591285 python3.9[64964]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:19 np0005591285 python3.9[65042]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.17ua336c recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:20 np0005591285 python3.9[65194]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:21 np0005591285 python3.9[65317]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037860.0617604-145-231699354345538/.source _original_basename=.rvvqtxw_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:22 np0005591285 python3.9[65469]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:24:22 np0005591285 python3.9[65621]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:23 np0005591285 python3.9[65744]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037862.2973866-217-141152502149360/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:24:24 np0005591285 python3.9[65896]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:24 np0005591285 python3.9[66019]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037863.6546688-217-271965969690413/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:24:25 np0005591285 python3.9[66171]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:26 np0005591285 python3.9[66323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:27 np0005591285 python3.9[66446]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037865.893919-329-166551350101809/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:27 np0005591285 python3.9[66598]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:28 np0005591285 python3.9[66721]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037867.3056448-374-73437694015465/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:29 np0005591285 python3.9[66873]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:24:29 np0005591285 systemd[1]: Reloading.
Jan 21 18:24:29 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:24:29 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:24:29 np0005591285 systemd[1]: Reloading.
Jan 21 18:24:30 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:24:30 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:24:30 np0005591285 systemd[1]: Starting EDPM Container Shutdown...
Jan 21 18:24:30 np0005591285 systemd[1]: Finished EDPM Container Shutdown.
Jan 21 18:24:31 np0005591285 python3.9[67101]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:31 np0005591285 python3.9[67224]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037870.7370543-443-109581240269029/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:32 np0005591285 python3.9[67376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:33 np0005591285 python3.9[67499]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037872.2088413-487-57786509955421/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:34 np0005591285 python3.9[67651]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:24:34 np0005591285 systemd[1]: Reloading.
Jan 21 18:24:34 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:24:34 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:24:34 np0005591285 systemd[1]: Reloading.
Jan 21 18:24:34 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:24:34 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:24:34 np0005591285 systemd[1]: Starting Create netns directory...
Jan 21 18:24:34 np0005591285 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 18:24:34 np0005591285 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 18:24:34 np0005591285 systemd[1]: Finished Create netns directory.
Jan 21 18:24:35 np0005591285 python3.9[67878]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:24:35 np0005591285 network[67895]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:24:35 np0005591285 network[67896]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:24:35 np0005591285 network[67897]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:24:41 np0005591285 python3.9[68159]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:24:41 np0005591285 systemd[1]: Reloading.
Jan 21 18:24:41 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:24:41 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:24:41 np0005591285 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 21 18:24:42 np0005591285 iptables.init[68199]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 21 18:24:42 np0005591285 iptables.init[68199]: iptables: Flushing firewall rules: [  OK  ]
Jan 21 18:24:42 np0005591285 systemd[1]: iptables.service: Deactivated successfully.
Jan 21 18:24:42 np0005591285 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 21 18:24:43 np0005591285 python3.9[68395]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:24:44 np0005591285 python3.9[68549]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:24:44 np0005591285 systemd[1]: Reloading.
Jan 21 18:24:44 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:24:44 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:24:44 np0005591285 systemd[1]: Starting Netfilter Tables...
Jan 21 18:24:44 np0005591285 systemd[1]: Finished Netfilter Tables.
Jan 21 18:24:45 np0005591285 python3.9[68740]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:24:46 np0005591285 python3.9[68893]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:47 np0005591285 python3.9[69018]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037886.3450227-695-18469562523285/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:48 np0005591285 python3.9[69171]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:24:48 np0005591285 systemd[1]: Reloading OpenSSH server daemon...
Jan 21 18:24:48 np0005591285 systemd[1]: Reloaded OpenSSH server daemon.
Jan 21 18:24:49 np0005591285 python3.9[69327]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:50 np0005591285 python3.9[69479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:51 np0005591285 python3.9[69602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037889.624845-788-83477319989521/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:52 np0005591285 python3.9[69754]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 18:24:52 np0005591285 systemd[1]: Starting Time & Date Service...
Jan 21 18:24:52 np0005591285 systemd[1]: Started Time & Date Service.
Jan 21 18:24:54 np0005591285 python3.9[69910]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:55 np0005591285 python3.9[70062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:55 np0005591285 python3.9[70185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037894.6568384-892-57247358378340/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:56 np0005591285 python3.9[70337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:56 np0005591285 python3.9[70460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037895.9897454-938-260223782915507/.source.yaml _original_basename=.hwtt4cpx follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:57 np0005591285 python3.9[70612]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:24:58 np0005591285 python3.9[70735]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037897.2744725-983-30937911993333/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:24:59 np0005591285 python3.9[70887]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:24:59 np0005591285 python3.9[71040]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:25:00 np0005591285 python3[71193]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 18:25:01 np0005591285 python3.9[71345]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:25:02 np0005591285 python3.9[71468]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037901.1207128-1100-30714637038772/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:03 np0005591285 python3.9[71620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:25:03 np0005591285 python3.9[71743]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037902.5734718-1145-258597435886767/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:04 np0005591285 python3.9[71895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:25:05 np0005591285 python3.9[72018]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037904.0422258-1190-206862162087956/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:05 np0005591285 python3.9[72170]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:25:06 np0005591285 python3.9[72293]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037905.4671166-1235-60888280651523/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:07 np0005591285 python3.9[72445]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:25:08 np0005591285 python3.9[72568]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037906.8540156-1279-100340625169180/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:09 np0005591285 python3.9[72720]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:09 np0005591285 python3.9[72872]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:25:10 np0005591285 python3.9[73031]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:11 np0005591285 python3.9[73184]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:12 np0005591285 python3.9[73336]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:13 np0005591285 python3.9[73488]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 18:25:13 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:25:14 np0005591285 python3.9[73642]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 18:25:14 np0005591285 systemd[1]: session-15.scope: Deactivated successfully.
Jan 21 18:25:14 np0005591285 systemd[1]: session-15.scope: Consumed 40.193s CPU time.
Jan 21 18:25:14 np0005591285 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Jan 21 18:25:14 np0005591285 systemd-logind[788]: Removed session 15.
Jan 21 18:25:19 np0005591285 systemd-logind[788]: New session 16 of user zuul.
Jan 21 18:25:19 np0005591285 systemd[1]: Started Session 16 of User zuul.
Jan 21 18:25:20 np0005591285 python3.9[73823]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 21 18:25:21 np0005591285 python3.9[73975]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:25:22 np0005591285 python3.9[74127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:25:23 np0005591285 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 18:25:23 np0005591285 python3.9[74279]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC26D51NdJjdilPO47VkyAGWZEKpDvfQ2t45jAnFi+yGdqGpJZeqIqXy1qJWgR+nOjHPpu4xyjUsXsUdkcmQySQ9nELhPXxBtFGM3LlXjhhk0Yibj4G2gfuMuG/m8d0BtpBY66pWUvd424nrAKh1ObdZgR5iHS4dtFVcrUPD7nmkE3YxEDETOTc5d/Tcal9MQArb/rQQAs2Z7N4Lgv1bSzhuu70Ij9qUff8SJhc5ZBQkAGKfNPP8XajfuTOvnEOo9uZQjTKcFZnsiSBUnxId028vihtYF6+NFOByOltsmJc7OIafk5r6JZzbps6FcCaOaT2TRLuLemBS+qfS4N0tWS1iJ00Jo7h7y+UdgDBFB3/zCHD5KiHOYCHbXdqtz8HUwsz65bdDEsKyJdh6qyFv5DN7sbB1UK6Yr/urKbVGR2mYP7sNEIAcSC9HZ2vehi9Hm/TSD7IfvR2i96ckZOsnHD3QeMUyJXjqk3PG7rlUM7NxZYyHaTuZzYrR5DvOsUjS1s=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICZF6/j6naCAJ9xH6aYQVqdvwoz3vezm/JU2Pso9ogKK#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLqlgZ52debu0OKcJwhzrcTUf3XONAZS4TIW+jISXbbaqXAGs35QUNRljBr9O34MR2l+Jib4kJghkCYEmTbTxNo=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmNgflEDQr9DxhZFToMSHP67cO7SUQpgVB7thv3JwDIojWojCRgQSVty7S1IJD5allDPdSEn7he/4X0ePPAI6phFNIWx+fwLuXpedyRVclMG0GASpOZ1kxLiQoMh+DOdnJArZ4llA4Lxdm7MyKCzA+Tna+2Z0+XrBTZjxzM4NbwGmUrESDcTXXu7f/vCq0QTRmjHLTbEvqFbJJzIetehEBC/yJb+35myPPBJ5IU8op6ixtbvwk2pzrRYr/NOUsf/ODWITXAvMjl6U1iE2Np2giBVqfz3zKkoH7gkMRHUmwxetTejWa1kIIZRiJUsQRetDm7v+bkaHGpwokxAC3n7pMwzdSO59inU/Lpr63ruukI64YeLK7FQiJ9557a+lcykXz0xgDF2aNHS9jyyhLQ0EHQGUgQToa462bLJwlCLFkxHpamrQKS67M+71TcFb//zx7kRmUT7NSxGHNAe155RcO0L1mkDHk1r2xOfk2iLNJtgYdCcVlwSRp4beSywJSECk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILEP7bJKdLXxjWmdj4eC7ngVkPSbC0h6tc+Oej4hLtk7#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO/PF1lIcuvdp/VOQkUSqyeGOw1ILI4bhZtJ8xgcsTd+//1XE1ll313MwTKeS1n9loXGAVB4+f9lF2fbY4gEkQI=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrBpqo7KcpSwQCe8bhjM7Y7rs9JlI0f5WrmGDjYEfFo3lyWMOb2fxXrIDWqMZa5HGb83LwgwgDL1MhTWmi47d7h8Wcxzg4uoflPcGqILiXv9Z+T68l/C6NT2ur0r4Njrz27cayzBtDPz1wKz1bf72s+Jm7Ukl84pubtCYfPhpZ6HBojmNiq+gesC60N0wbEbIHDEgd+jVptW/UdWmhzO7xEBn3qbNPk6UpnYJSU+Z2wGx6hHckTSl5Wy/7RQ2HXE990+4qkeVl88lR/LqsGthwUQ8tlp8F33yw3IS9D0uurGkuqY4GyRjexrol0VPx9VlrPU0y4K+1pP59O4qo9+z/eylWJViS4R223v0JF2RIrH6aQvHTtV1un22qYnTCTCQrZ6KAKQipc0pawnz7DdXE3D2gwcQkZZmcYm9JboWqFn5/80rsuHUZmMBOHy5owN7IjIly0yAPxjAIZy5dMr1MkQP9o/FSnvyQzt11XeO/49/DI3FH0TkomkN21/QhSYE=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKG99Xw/DkEh2LuhUTQH1tq7VFfroV01ukYKDqY+UjHx#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBvYfIE3Tv6SOsn96jsNhozh4WS77CDAl4JYSfjVLVK/RVCTMxlZOAnhAHwDUgcw2k0t2eycyJ2wTJO6OCAqGM4=#012 create=True mode=0644 path=/tmp/ansible.tw5l4qul state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:24 np0005591285 python3.9[74433]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.tw5l4qul' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:25:25 np0005591285 python3.9[74587]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.tw5l4qul state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:26 np0005591285 systemd[1]: session-16.scope: Deactivated successfully.
Jan 21 18:25:26 np0005591285 systemd[1]: session-16.scope: Consumed 3.780s CPU time.
Jan 21 18:25:26 np0005591285 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Jan 21 18:25:26 np0005591285 systemd-logind[788]: Removed session 16.
Jan 21 18:25:30 np0005591285 systemd-logind[788]: New session 17 of user zuul.
Jan 21 18:25:30 np0005591285 systemd[1]: Started Session 17 of User zuul.
Jan 21 18:25:31 np0005591285 python3.9[74765]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:25:33 np0005591285 python3.9[74921]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 18:25:34 np0005591285 python3.9[75075]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:25:35 np0005591285 python3.9[75228]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:25:36 np0005591285 python3.9[75381]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:25:36 np0005591285 python3.9[75535]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:25:37 np0005591285 python3.9[75690]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:25:38 np0005591285 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Jan 21 18:25:38 np0005591285 systemd[1]: session-17.scope: Deactivated successfully.
Jan 21 18:25:38 np0005591285 systemd[1]: session-17.scope: Consumed 4.925s CPU time.
Jan 21 18:25:38 np0005591285 systemd-logind[788]: Removed session 17.
Jan 21 18:25:44 np0005591285 systemd-logind[788]: New session 18 of user zuul.
Jan 21 18:25:44 np0005591285 systemd[1]: Started Session 18 of User zuul.
Jan 21 18:25:45 np0005591285 python3.9[75868]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:25:46 np0005591285 python3.9[76024]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:25:47 np0005591285 python3.9[76108]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 18:25:49 np0005591285 python3.9[76259]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:25:50 np0005591285 python3.9[76410]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:25:51 np0005591285 python3.9[76560]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:25:52 np0005591285 python3.9[76710]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:25:52 np0005591285 systemd[1]: session-18.scope: Deactivated successfully.
Jan 21 18:25:52 np0005591285 systemd[1]: session-18.scope: Consumed 6.565s CPU time.
Jan 21 18:25:52 np0005591285 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Jan 21 18:25:52 np0005591285 systemd-logind[788]: Removed session 18.
Jan 21 18:25:58 np0005591285 systemd-logind[788]: New session 19 of user zuul.
Jan 21 18:25:58 np0005591285 systemd[1]: Started Session 19 of User zuul.
Jan 21 18:25:59 np0005591285 python3.9[76888]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:26:01 np0005591285 python3.9[77044]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:01 np0005591285 python3.9[77196]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:02 np0005591285 python3.9[77348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:03 np0005591285 python3.9[77471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037962.0414882-159-129164682098016/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=cd4425dbd3ff138d6aedfb1ca3d775a590d07ca0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:04 np0005591285 python3.9[77623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:04 np0005591285 python3.9[77746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037963.4914389-159-175969766315659/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=9c03bcfa62361e5ef322801c360476a6187916b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:05 np0005591285 python3.9[77898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:05 np0005591285 python3.9[78021]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037964.815994-159-150227338169550/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=185b3f0370d42b444b1c4be14563a9a206f07e89 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:06 np0005591285 python3.9[78173]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:07 np0005591285 python3.9[78325]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:08 np0005591285 python3.9[78477]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:08 np0005591285 python3.9[78600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037967.7266977-338-250194802921726/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=53213fb53c8c355b98ef9e6896848e534f9bc9ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:09 np0005591285 python3.9[78752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:10 np0005591285 python3.9[78875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037968.9737742-338-271474741179261/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=d8fd7bb3e34b5ea059d1c8aca5209211b8d4078a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:10 np0005591285 python3.9[79027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:11 np0005591285 python3.9[79150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037970.313258-338-106812570826364/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=8810e18040b27463a45ea44ab7fc1b1003d67241 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:12 np0005591285 python3.9[79302]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:12 np0005591285 python3.9[79454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:13 np0005591285 python3.9[79606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:14 np0005591285 python3.9[79729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037972.9841106-523-259326557242430/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=257ef5303340e9f531f23aa60acb97d363ff7a6c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:14 np0005591285 python3.9[79881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:15 np0005591285 python3.9[80004]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037974.186561-523-26325071383469/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=5ac53e5233bb5dc2a1a1ee89225b6d9cf54a324a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:15 np0005591285 python3.9[80156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:16 np0005591285 python3.9[80279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037975.418953-523-241151539545220/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=1eddacc43563ce56331199d2b0d81e88056f4c0a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:17 np0005591285 python3.9[80431]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:17 np0005591285 python3.9[80583]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:18 np0005591285 python3.9[80735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:19 np0005591285 python3.9[80859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037978.1998549-699-136406103291534/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=f73798aa9b25f54b6d3cd4159b7efaad676caee5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:19 np0005591285 chronyd[64454]: Selected source 198.181.199.86 (pool.ntp.org)
Jan 21 18:26:20 np0005591285 python3.9[81011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:21 np0005591285 python3.9[81134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037979.8639488-699-100489597248263/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=5ac53e5233bb5dc2a1a1ee89225b6d9cf54a324a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:21 np0005591285 python3.9[81286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:22 np0005591285 python3.9[81409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037981.2361064-699-16549199846827/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=c00be9c06428664ea02965fa8fd25e3fa7b4c409 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:23 np0005591285 python3.9[81561]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:24 np0005591285 python3.9[81713]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:24 np0005591285 python3.9[81836]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037983.7628822-925-178131946041077/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:25 np0005591285 python3.9[81988]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:26 np0005591285 python3.9[82140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:26 np0005591285 python3.9[82263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037985.9190722-1000-204405592040158/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:27 np0005591285 python3.9[82415]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:28 np0005591285 python3.9[82567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:29 np0005591285 python3.9[82690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037987.8809779-1072-168374391010880/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:29 np0005591285 python3.9[82842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:30 np0005591285 python3.9[82994]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:31 np0005591285 python3.9[83117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037990.086879-1143-268868019058161/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:31 np0005591285 python3.9[83269]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:32 np0005591285 python3.9[83421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:33 np0005591285 python3.9[83544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037992.1603332-1218-94651112468909/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:33 np0005591285 python3.9[83696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:34 np0005591285 python3.9[83848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:35 np0005591285 python3.9[83971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037994.197717-1291-115683343390408/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:36 np0005591285 python3.9[84123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:36 np0005591285 python3.9[84275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:37 np0005591285 python3.9[84398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037996.204805-1342-266509387871102/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:37 np0005591285 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Jan 21 18:26:37 np0005591285 systemd[1]: session-19.scope: Deactivated successfully.
Jan 21 18:26:37 np0005591285 systemd[1]: session-19.scope: Consumed 32.032s CPU time.
Jan 21 18:26:37 np0005591285 systemd-logind[788]: Removed session 19.
Jan 21 18:26:43 np0005591285 systemd-logind[788]: New session 20 of user zuul.
Jan 21 18:26:43 np0005591285 systemd[1]: Started Session 20 of User zuul.
Jan 21 18:26:44 np0005591285 python3.9[84576]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:26:45 np0005591285 python3.9[84732]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:46 np0005591285 python3.9[84884]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:26:46 np0005591285 python3.9[85034]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:26:48 np0005591285 python3.9[85186]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 18:26:50 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 21 18:26:50 np0005591285 python3.9[85342]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:26:51 np0005591285 python3.9[85427]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:26:53 np0005591285 python3.9[85580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:26:54 np0005591285 python3[85735]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 21 18:26:55 np0005591285 python3.9[85887]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:56 np0005591285 python3.9[86039]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:57 np0005591285 python3.9[86117]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:57 np0005591285 python3.9[86269]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:58 np0005591285 python3.9[86347]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.t01iik88 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:26:59 np0005591285 python3.9[86499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:26:59 np0005591285 python3.9[86577]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:00 np0005591285 python3.9[86729]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:01 np0005591285 python3[86882]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 18:27:02 np0005591285 python3.9[87034]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:03 np0005591285 python3.9[87159]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038021.755866-434-96407687941457/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:03 np0005591285 python3.9[87311]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:04 np0005591285 python3.9[87436]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038023.24867-479-151880176080268/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:05 np0005591285 python3.9[87588]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:05 np0005591285 python3.9[87713]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038024.6912582-524-134306574592430/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:06 np0005591285 python3.9[87865]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:07 np0005591285 python3.9[87990]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038026.1108081-569-90386497154756/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:08 np0005591285 python3.9[88142]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:08 np0005591285 python3.9[88267]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038027.5572803-613-146806783892369/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:09 np0005591285 python3.9[88419]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:10 np0005591285 python3.9[88571]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:11 np0005591285 python3.9[88726]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:12 np0005591285 python3.9[88878]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:13 np0005591285 python3.9[89031]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:27:14 np0005591285 python3.9[89185]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:15 np0005591285 python3.9[89340]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:16 np0005591285 python3.9[89490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:27:17 np0005591285 python3.9[89643]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:17 np0005591285 ovs-vsctl[89644]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 21 18:27:18 np0005591285 python3.9[89796]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:19 np0005591285 python3.9[89951]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:19 np0005591285 ovs-vsctl[89952]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 21 18:27:20 np0005591285 python3.9[90102]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:27:21 np0005591285 python3.9[90256]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:27:21 np0005591285 python3.9[90408]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:22 np0005591285 python3.9[90486]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:27:22 np0005591285 python3.9[90638]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:23 np0005591285 python3.9[90716]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:27:24 np0005591285 python3.9[90868]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:25 np0005591285 python3.9[91020]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:25 np0005591285 python3.9[91098]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:26 np0005591285 python3.9[91250]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:26 np0005591285 python3.9[91328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:27 np0005591285 python3.9[91480]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:27:27 np0005591285 systemd[1]: Reloading.
Jan 21 18:27:27 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:27:27 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:27:28 np0005591285 python3.9[91670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:29 np0005591285 python3.9[91748]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:29 np0005591285 python3.9[91900]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:30 np0005591285 python3.9[91978]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:31 np0005591285 python3.9[92130]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:27:31 np0005591285 systemd[1]: Reloading.
Jan 21 18:27:31 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:27:31 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:27:31 np0005591285 systemd[1]: Starting Create netns directory...
Jan 21 18:27:31 np0005591285 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 18:27:31 np0005591285 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 18:27:31 np0005591285 systemd[1]: Finished Create netns directory.
Jan 21 18:27:32 np0005591285 python3.9[92323]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:27:33 np0005591285 python3.9[92475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:33 np0005591285 python3.9[92598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038052.7678103-1366-264619380404753/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:27:34 np0005591285 python3.9[92750]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:35 np0005591285 python3.9[92902]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:27:36 np0005591285 python3.9[93054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:36 np0005591285 python3.9[93177]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038055.7881148-1465-199710255692525/.source.json _original_basename=.zuj6e7iu follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:37 np0005591285 python3.9[93327]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:40 np0005591285 python3.9[93750]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 21 18:27:41 np0005591285 python3.9[93902]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:27:42 np0005591285 python3[94054]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:27:42 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:27:42 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:27:42 np0005591285 podman[94091]: 2026-01-21 23:27:42.740739296 +0000 UTC m=+0.053453391 container create e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 21 18:27:42 np0005591285 podman[94091]: 2026-01-21 23:27:42.713363068 +0000 UTC m=+0.026077183 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 18:27:42 np0005591285 python3[94054]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 18:27:43 np0005591285 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 18:27:43 np0005591285 python3.9[94281]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:27:44 np0005591285 python3.9[94435]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:44 np0005591285 python3.9[94511]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:27:45 np0005591285 python3.9[94662]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038064.9230094-1699-101980593549731/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:46 np0005591285 python3.9[94738]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:27:46 np0005591285 systemd[1]: Reloading.
Jan 21 18:27:46 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:27:46 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:27:47 np0005591285 python3.9[94850]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:27:47 np0005591285 systemd[1]: Reloading.
Jan 21 18:27:47 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:27:47 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:27:47 np0005591285 systemd[1]: Starting ovn_controller container...
Jan 21 18:27:47 np0005591285 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 21 18:27:47 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:27:47 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64af8e382a2ce74444463970a3e8c0af33ff904da8bed5c92dfba75bfe16c1be/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 18:27:47 np0005591285 systemd[1]: Started /usr/bin/podman healthcheck run e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e.
Jan 21 18:27:47 np0005591285 podman[94892]: 2026-01-21 23:27:47.466642252 +0000 UTC m=+0.119077599 container init e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + sudo -E kolla_set_configs
Jan 21 18:27:47 np0005591285 podman[94892]: 2026-01-21 23:27:47.489158199 +0000 UTC m=+0.141593516 container start e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller)
Jan 21 18:27:47 np0005591285 edpm-start-podman-container[94892]: ovn_controller
Jan 21 18:27:47 np0005591285 systemd[1]: Created slice User Slice of UID 0.
Jan 21 18:27:47 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 21 18:27:47 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 21 18:27:47 np0005591285 edpm-start-podman-container[94891]: Creating additional drop-in dependency for "ovn_controller" (e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e)
Jan 21 18:27:47 np0005591285 systemd[1]: Starting User Manager for UID 0...
Jan 21 18:27:47 np0005591285 systemd[1]: Reloading.
Jan 21 18:27:47 np0005591285 podman[94914]: 2026-01-21 23:27:47.581036484 +0000 UTC m=+0.072762421 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:27:47 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:27:47 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:27:47 np0005591285 systemd[94941]: Queued start job for default target Main User Target.
Jan 21 18:27:47 np0005591285 systemd[94941]: Created slice User Application Slice.
Jan 21 18:27:47 np0005591285 systemd[94941]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 21 18:27:47 np0005591285 systemd[94941]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:27:47 np0005591285 systemd[94941]: Reached target Paths.
Jan 21 18:27:47 np0005591285 systemd[94941]: Reached target Timers.
Jan 21 18:27:47 np0005591285 systemd[94941]: Starting D-Bus User Message Bus Socket...
Jan 21 18:27:47 np0005591285 systemd[94941]: Starting Create User's Volatile Files and Directories...
Jan 21 18:27:47 np0005591285 systemd[94941]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:27:47 np0005591285 systemd[94941]: Reached target Sockets.
Jan 21 18:27:47 np0005591285 systemd[94941]: Finished Create User's Volatile Files and Directories.
Jan 21 18:27:47 np0005591285 systemd[94941]: Reached target Basic System.
Jan 21 18:27:47 np0005591285 systemd[94941]: Reached target Main User Target.
Jan 21 18:27:47 np0005591285 systemd[94941]: Startup finished in 116ms.
Jan 21 18:27:47 np0005591285 systemd[1]: Started User Manager for UID 0.
Jan 21 18:27:47 np0005591285 systemd[1]: Started ovn_controller container.
Jan 21 18:27:47 np0005591285 systemd[1]: e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e-771dc736084769aa.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:27:47 np0005591285 systemd[1]: e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e-771dc736084769aa.service: Failed with result 'exit-code'.
Jan 21 18:27:47 np0005591285 systemd[1]: Started Session c1 of User root.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: INFO:__main__:Validating config file
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: INFO:__main__:Writing out command to execute
Jan 21 18:27:47 np0005591285 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: ++ cat /run_command
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + ARGS=
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + sudo kolla_copy_cacerts
Jan 21 18:27:47 np0005591285 systemd[1]: Started Session c2 of User root.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + [[ ! -n '' ]]
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + . kolla_extend_start
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + umask 0022
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 21 18:27:47 np0005591285 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9627] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9636] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <warn>  [1769038067.9639] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9644] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9649] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9652] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 18:27:47 np0005591285 kernel: br-int: entered promiscuous mode
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00011|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00012|features|INFO|OVS Feature: ct_flush, state: supported
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00013|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00014|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00015|main|INFO|OVS feature set changed, force recompute.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00016|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 18:27:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:27:47Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9915] manager: (ovn-7f404a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 21 18:27:47 np0005591285 NetworkManager[55017]: <info>  [1769038067.9923] manager: (ovn-f0bd48-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 21 18:27:47 np0005591285 systemd-udevd[95045]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:27:48 np0005591285 kernel: genev_sys_6081: entered promiscuous mode
Jan 21 18:27:48 np0005591285 systemd-udevd[95047]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:27:48 np0005591285 NetworkManager[55017]: <info>  [1769038068.0107] device (genev_sys_6081): carrier: link connected
Jan 21 18:27:48 np0005591285 NetworkManager[55017]: <info>  [1769038068.0113] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Jan 21 18:27:48 np0005591285 NetworkManager[55017]: <info>  [1769038068.3336] manager: (ovn-74526b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 21 18:27:49 np0005591285 python3.9[95175]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:27:50 np0005591285 python3.9[95327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:27:51 np0005591285 python3.9[95450]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038070.0515363-1834-122924433732300/.source.yaml _original_basename=.mjkkk1aw follow=False checksum=1fa0f89c2313d90a3d28193c1cbb0dd87b38dad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:27:51 np0005591285 python3.9[95602]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:51 np0005591285 ovs-vsctl[95603]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 21 18:27:52 np0005591285 python3.9[95755]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:52 np0005591285 ovs-vsctl[95757]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 21 18:27:54 np0005591285 python3.9[95910]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:27:54 np0005591285 ovs-vsctl[95911]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 21 18:27:54 np0005591285 systemd[1]: session-20.scope: Deactivated successfully.
Jan 21 18:27:54 np0005591285 systemd[1]: session-20.scope: Consumed 49.996s CPU time.
Jan 21 18:27:54 np0005591285 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Jan 21 18:27:54 np0005591285 systemd-logind[788]: Removed session 20.
Jan 21 18:27:58 np0005591285 systemd[1]: Stopping User Manager for UID 0...
Jan 21 18:27:58 np0005591285 systemd[94941]: Activating special unit Exit the Session...
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped target Main User Target.
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped target Basic System.
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped target Paths.
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped target Sockets.
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped target Timers.
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:27:58 np0005591285 systemd[94941]: Closed D-Bus User Message Bus Socket.
Jan 21 18:27:58 np0005591285 systemd[94941]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:27:58 np0005591285 systemd[94941]: Removed slice User Application Slice.
Jan 21 18:27:58 np0005591285 systemd[94941]: Reached target Shutdown.
Jan 21 18:27:58 np0005591285 systemd[94941]: Finished Exit the Session.
Jan 21 18:27:58 np0005591285 systemd[94941]: Reached target Exit the Session.
Jan 21 18:27:58 np0005591285 systemd[1]: user@0.service: Deactivated successfully.
Jan 21 18:27:58 np0005591285 systemd[1]: Stopped User Manager for UID 0.
Jan 21 18:27:58 np0005591285 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 21 18:27:58 np0005591285 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 21 18:27:58 np0005591285 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 21 18:27:58 np0005591285 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 21 18:27:58 np0005591285 systemd[1]: Removed slice User Slice of UID 0.
Jan 21 18:27:59 np0005591285 systemd-logind[788]: New session 22 of user zuul.
Jan 21 18:27:59 np0005591285 systemd[1]: Started Session 22 of User zuul.
Jan 21 18:28:01 np0005591285 python3.9[96092]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:28:02 np0005591285 python3.9[96248]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:03 np0005591285 python3.9[96400]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:03 np0005591285 python3.9[96552]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:04 np0005591285 python3.9[96704]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:05 np0005591285 python3.9[96856]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:06 np0005591285 python3.9[97006]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:28:07 np0005591285 python3.9[97159]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 18:28:08 np0005591285 python3.9[97309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:09 np0005591285 python3.9[97430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038087.9780874-220-203027313488540/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:10 np0005591285 python3.9[97580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:10 np0005591285 python3.9[97701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038089.6268666-266-106871309372542/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:11 np0005591285 python3.9[97853]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:28:12 np0005591285 python3.9[97937]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:28:15 np0005591285 python3.9[98090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:28:16 np0005591285 python3.9[98243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:16 np0005591285 python3.9[98364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038095.9826918-377-116008120436628/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:17 np0005591285 python3.9[98514]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:18 np0005591285 ovn_controller[94908]: 2026-01-21T23:28:18Z|00025|memory|INFO|16256 kB peak resident set size after 30.1 seconds
Jan 21 18:28:18 np0005591285 ovn_controller[94908]: 2026-01-21T23:28:18Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 21 18:28:18 np0005591285 podman[98609]: 2026-01-21 23:28:18.115258386 +0000 UTC m=+0.178621622 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:28:18 np0005591285 python3.9[98645]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038097.155802-377-161045054682284/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:19 np0005591285 python3.9[98809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:20 np0005591285 python3.9[98930]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038099.0400004-508-153662732178541/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:20 np0005591285 python3.9[99080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:21 np0005591285 python3.9[99201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038100.2690086-508-266766492277414/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:22 np0005591285 python3.9[99351]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:28:23 np0005591285 python3.9[99505]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:23 np0005591285 python3.9[99657]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:24 np0005591285 python3.9[99735]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:24 np0005591285 python3.9[99887]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:25 np0005591285 python3.9[99965]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:26 np0005591285 python3.9[100117]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:27 np0005591285 python3.9[100269]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:27 np0005591285 python3.9[100347]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:28 np0005591285 python3.9[100499]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:28 np0005591285 python3.9[100577]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:29 np0005591285 python3.9[100729]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:28:29 np0005591285 systemd[1]: Reloading.
Jan 21 18:28:29 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:28:29 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:28:31 np0005591285 python3.9[100917]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:31 np0005591285 python3.9[100995]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:32 np0005591285 python3.9[101147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:32 np0005591285 python3.9[101225]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:33 np0005591285 python3.9[101377]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:28:33 np0005591285 systemd[1]: Reloading.
Jan 21 18:28:33 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:28:33 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:28:34 np0005591285 systemd[1]: Starting Create netns directory...
Jan 21 18:28:34 np0005591285 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 18:28:34 np0005591285 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 18:28:34 np0005591285 systemd[1]: Finished Create netns directory.
Jan 21 18:28:34 np0005591285 python3.9[101570]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:35 np0005591285 python3.9[101722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:36 np0005591285 python3.9[101845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038115.2463188-962-76879878153787/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:37 np0005591285 python3.9[101997]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:38 np0005591285 python3.9[102149]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:28:38 np0005591285 python3.9[102301]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:28:39 np0005591285 python3.9[102424]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038118.32798-1060-275028466954154/.source.json _original_basename=.3nuohk66 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:40 np0005591285 python3.9[102574]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:42 np0005591285 python3.9[102997]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 21 18:28:43 np0005591285 python3.9[103149]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:28:45 np0005591285 python3[103301]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:28:52 np0005591285 podman[103359]: 2026-01-21 23:28:52.157107413 +0000 UTC m=+2.810423137 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Jan 21 18:28:55 np0005591285 podman[103315]: 2026-01-21 23:28:55.766128307 +0000 UTC m=+10.444895009 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:28:56 np0005591285 podman[103440]: 2026-01-21 23:28:56.01084178 +0000 UTC m=+0.063894470 container create 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:28:56 np0005591285 podman[103440]: 2026-01-21 23:28:55.977650491 +0000 UTC m=+0.030703231 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:28:56 np0005591285 python3[103301]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:28:56 np0005591285 python3.9[103630]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:28:57 np0005591285 python3.9[103784]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:58 np0005591285 python3.9[103860]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:28:59 np0005591285 python3.9[104011]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038138.2892466-1294-113329128950564/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:28:59 np0005591285 python3.9[104087]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:28:59 np0005591285 systemd[1]: Reloading.
Jan 21 18:28:59 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:28:59 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:29:00 np0005591285 python3.9[104198]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:00 np0005591285 systemd[1]: Reloading.
Jan 21 18:29:00 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:29:00 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:29:00 np0005591285 systemd[1]: Starting ovn_metadata_agent container...
Jan 21 18:29:00 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:29:00 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec3b0d48d624b24d6647e85950f1aa6aea46a9fac167eda036ce1e35694e536a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 21 18:29:00 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec3b0d48d624b24d6647e85950f1aa6aea46a9fac167eda036ce1e35694e536a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:29:00 np0005591285 systemd[1]: Started /usr/bin/podman healthcheck run 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995.
Jan 21 18:29:01 np0005591285 podman[104239]: 2026-01-21 23:29:01.002620275 +0000 UTC m=+0.144597982 container init 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + sudo -E kolla_set_configs
Jan 21 18:29:01 np0005591285 podman[104239]: 2026-01-21 23:29:01.030416314 +0000 UTC m=+0.172394001 container start 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:29:01 np0005591285 edpm-start-podman-container[104239]: ovn_metadata_agent
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Validating config file
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Copying service configuration files
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Writing out command to execute
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 21 18:29:01 np0005591285 podman[104260]: 2026-01-21 23:29:01.101497971 +0000 UTC m=+0.061593906 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 18:29:01 np0005591285 edpm-start-podman-container[104238]: Creating additional drop-in dependency for "ovn_metadata_agent" (482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995)
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: ++ cat /run_command
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + CMD=neutron-ovn-metadata-agent
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + ARGS=
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + sudo kolla_copy_cacerts
Jan 21 18:29:01 np0005591285 systemd[1]: Reloading.
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + [[ ! -n '' ]]
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + . kolla_extend_start
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: Running command: 'neutron-ovn-metadata-agent'
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + umask 0022
Jan 21 18:29:01 np0005591285 ovn_metadata_agent[104254]: + exec neutron-ovn-metadata-agent
Jan 21 18:29:01 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:29:01 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:29:01 np0005591285 systemd[1]: Started ovn_metadata_agent container.
Jan 21 18:29:02 np0005591285 python3.9[104492]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.893 104259 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.893 104259 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.893 104259 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.893 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.894 104259 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.895 104259 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.896 104259 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.897 104259 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.898 104259 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.899 104259 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.900 104259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.901 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.902 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.903 104259 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.904 104259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.905 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.906 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.907 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.908 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.909 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.910 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.911 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.912 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.913 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.914 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.915 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.916 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.917 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.918 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.919 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.920 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.921 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.922 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.923 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.924 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.925 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.926 104259 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.935 104259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.935 104259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.935 104259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.936 104259 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.936 104259 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.949 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name ce4b296c-26ac-415a-aa87-9634754eb3d3 (UUID: ce4b296c-26ac-415a-aa87-9634754eb3d3) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.978 104259 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.978 104259 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.978 104259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.978 104259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.982 104259 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.987 104259 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.994 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'ce4b296c-26ac-415a-aa87-9634754eb3d3'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], external_ids={}, name=ce4b296c-26ac-415a-aa87-9634754eb3d3, nb_cfg_timestamp=1769038075985, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.995 104259 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fdd55b5cb80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.996 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.996 104259 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.996 104259 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:02.997 104259 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.002 104259 DEBUG oslo_service.service [-] Started child 104517 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.005 104517 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-431760'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.005 104259 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpvmpwhs7p/privsep.sock']#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.032 104517 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.033 104517 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.033 104517 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.036 104517 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.043 104517 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.051 104517 INFO eventlet.wsgi.server [-] (104517) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 21 18:29:03 np0005591285 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 21 18:29:03 np0005591285 python3.9[104649]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.815 104259 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.816 104259 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvmpwhs7p/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.682 104650 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.689 104650 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.694 104650 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.694 104650 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104650#033[00m
Jan 21 18:29:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:03.820 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[449301db-53f8-4191-b2bd-4e49103699f8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:29:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:04.357 104650 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:29:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:04.357 104650 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:29:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:04.358 104650 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:29:04 np0005591285 python3.9[104779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038143.1609118-1429-48719391429218/.source.yaml _original_basename=.4bmrp28r follow=False checksum=f1357c586761a8e0d6b78c9eb359b93b40f134ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:05 np0005591285 systemd[1]: session-22.scope: Deactivated successfully.
Jan 21 18:29:05 np0005591285 systemd[1]: session-22.scope: Consumed 56.357s CPU time.
Jan 21 18:29:05 np0005591285 systemd-logind[788]: Session 22 logged out. Waiting for processes to exit.
Jan 21 18:29:05 np0005591285 systemd-logind[788]: Removed session 22.
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.042 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[19573153-c502-4362-b222-151dac022f0b]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.045 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, column=external_ids, values=({'neutron:ovn-metadata-id': '91248367-0c76-59ce-a190-18d7e60df115'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.198 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.205 104259 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.205 104259 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.205 104259 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.205 104259 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.205 104259 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.205 104259 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.206 104259 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.207 104259 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.208 104259 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.209 104259 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.210 104259 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.211 104259 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.212 104259 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.213 104259 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.214 104259 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.215 104259 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.216 104259 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.217 104259 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.217 104259 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.217 104259 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.217 104259 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.217 104259 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.218 104259 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.219 104259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.220 104259 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.221 104259 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.222 104259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.223 104259 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.224 104259 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.225 104259 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.226 104259 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.227 104259 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.228 104259 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.229 104259 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.230 104259 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.231 104259 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.232 104259 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.233 104259 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.234 104259 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.235 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.236 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.237 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.238 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.239 104259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.239 104259 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.239 104259 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.239 104259 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.239 104259 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:29:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:29:05.239 104259 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 21 18:29:10 np0005591285 systemd-logind[788]: New session 23 of user zuul.
Jan 21 18:29:10 np0005591285 systemd[1]: Started Session 23 of User zuul.
Jan 21 18:29:11 np0005591285 python3.9[104958]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:29:13 np0005591285 python3.9[105114]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:15 np0005591285 python3.9[105279]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:29:15 np0005591285 systemd[1]: Reloading.
Jan 21 18:29:15 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:29:15 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:29:16 np0005591285 python3.9[105464]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:29:17 np0005591285 network[105481]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:29:17 np0005591285 network[105482]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:29:17 np0005591285 network[105483]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:29:21 np0005591285 python3.9[105744]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:22 np0005591285 python3.9[105897]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:23 np0005591285 python3.9[106050]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:24 np0005591285 podman[106204]: 2026-01-21 23:29:24.215301684 +0000 UTC m=+0.088576032 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:29:24 np0005591285 python3.9[106203]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:25 np0005591285 python3.9[106382]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:25 np0005591285 python3.9[106535]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:27 np0005591285 python3.9[106688]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:29:29 np0005591285 python3.9[106841]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:31 np0005591285 python3.9[106993]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:31 np0005591285 podman[107117]: 2026-01-21 23:29:31.619321418 +0000 UTC m=+0.071722946 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:29:31 np0005591285 python3.9[107164]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:32 np0005591285 python3.9[107317]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:33 np0005591285 python3.9[107469]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:33 np0005591285 python3.9[107621]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:34 np0005591285 python3.9[107773]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:35 np0005591285 python3.9[107925]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:35 np0005591285 python3.9[108077]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:36 np0005591285 python3.9[108229]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:37 np0005591285 python3.9[108381]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:37 np0005591285 python3.9[108533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:38 np0005591285 python3.9[108685]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:39 np0005591285 python3.9[108837]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:29:40 np0005591285 python3.9[108989]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:41 np0005591285 python3.9[109141]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:29:42 np0005591285 python3.9[109293]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:29:42 np0005591285 systemd[1]: Reloading.
Jan 21 18:29:42 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:29:42 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:29:43 np0005591285 python3.9[109480]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:44 np0005591285 python3.9[109633]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:45 np0005591285 python3.9[109786]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:46 np0005591285 python3.9[109939]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:47 np0005591285 python3.9[110092]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:48 np0005591285 python3.9[110245]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:48 np0005591285 python3.9[110398]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:29:50 np0005591285 python3.9[110551]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 21 18:29:51 np0005591285 python3.9[110704]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:29:52 np0005591285 python3.9[110862]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:29:53 np0005591285 python3.9[111022]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:29:54 np0005591285 podman[111106]: 2026-01-21 23:29:54.400353747 +0000 UTC m=+0.108884112 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:29:54 np0005591285 python3.9[111107]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:30:02 np0005591285 podman[111144]: 2026-01-21 23:30:02.248014113 +0000 UTC m=+0.113243716 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:30:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:30:02.929 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:30:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:30:02.930 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:30:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:30:02.931 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:30:25 np0005591285 podman[111342]: 2026-01-21 23:30:25.278862366 +0000 UTC m=+0.129933041 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:30:31 np0005591285 kernel: SELinux:  Converting 2764 SID table entries...
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 18:30:31 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 18:30:33 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 21 18:30:33 np0005591285 podman[111378]: 2026-01-21 23:30:33.219893074 +0000 UTC m=+0.077380890 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 18:30:44 np0005591285 kernel: SELinux:  Converting 2764 SID table entries...
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 18:30:44 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 18:30:56 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 21 18:30:56 np0005591285 podman[111405]: 2026-01-21 23:30:56.299005318 +0000 UTC m=+0.139166253 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:31:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:31:02.930 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:31:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:31:02.931 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:31:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:31:02.931 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:31:04 np0005591285 podman[115130]: 2026-01-21 23:31:04.208027276 +0000 UTC m=+0.072510758 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:31:27 np0005591285 podman[128299]: 2026-01-21 23:31:27.279137876 +0000 UTC m=+0.146838965 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:31:35 np0005591285 podman[128342]: 2026-01-21 23:31:35.223160063 +0000 UTC m=+0.074563655 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:31:42 np0005591285 kernel: SELinux:  Converting 2765 SID table entries...
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability open_perms=1
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability always_check_network=0
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 18:31:42 np0005591285 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 18:31:44 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:31:44 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 21 18:31:44 np0005591285 dbus-broker-launch[764]: Noticed file-system modification, trigger reload.
Jan 21 18:31:52 np0005591285 systemd[1]: Stopping OpenSSH server daemon...
Jan 21 18:31:52 np0005591285 systemd[1]: sshd.service: Deactivated successfully.
Jan 21 18:31:52 np0005591285 systemd[1]: Stopped OpenSSH server daemon.
Jan 21 18:31:52 np0005591285 systemd[1]: sshd.service: Consumed 1.494s CPU time, read 564.0K from disk, written 4.0K to disk.
Jan 21 18:31:52 np0005591285 systemd[1]: Stopped target sshd-keygen.target.
Jan 21 18:31:52 np0005591285 systemd[1]: Stopping sshd-keygen.target...
Jan 21 18:31:52 np0005591285 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 18:31:52 np0005591285 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 18:31:52 np0005591285 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 18:31:52 np0005591285 systemd[1]: Reached target sshd-keygen.target.
Jan 21 18:31:52 np0005591285 systemd[1]: Starting OpenSSH server daemon...
Jan 21 18:31:52 np0005591285 systemd[1]: Started OpenSSH server daemon.
Jan 21 18:31:55 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:31:55 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:31:55 np0005591285 systemd[1]: Reloading.
Jan 21 18:31:55 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:31:55 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:31:55 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:31:58 np0005591285 podman[131908]: 2026-01-21 23:31:58.353565318 +0000 UTC m=+0.222724953 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:32:00 np0005591285 python3.9[134403]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:32:00 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:01 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:01 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:02 np0005591285 python3.9[135637]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:32:02 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:02 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:02 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:32:02.931 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:32:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:32:02.934 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:32:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:32:02.934 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:32:03 np0005591285 python3.9[136836]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:32:03 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:03 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:03 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:04 np0005591285 python3.9[138198]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:32:04 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:04 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:04 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:04 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:32:04 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:32:04 np0005591285 systemd[1]: man-db-cache-update.service: Consumed 11.968s CPU time.
Jan 21 18:32:04 np0005591285 systemd[1]: run-r0df1f0b2bf164c7e93ce03d21db613a6.service: Deactivated successfully.
Jan 21 18:32:06 np0005591285 podman[138573]: 2026-01-21 23:32:06.228403139 +0000 UTC m=+0.095296651 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 21 18:32:06 np0005591285 python3.9[138722]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:07 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:07 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:07 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:08 np0005591285 python3.9[138912]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:08 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:08 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:08 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:09 np0005591285 python3.9[139102]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:10 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:10 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:10 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:11 np0005591285 python3.9[139291]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:12 np0005591285 python3.9[139446]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:12 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:13 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:13 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:14 np0005591285 python3.9[139636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 18:32:14 np0005591285 systemd[1]: Reloading.
Jan 21 18:32:14 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:32:14 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:32:14 np0005591285 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 21 18:32:14 np0005591285 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 21 18:32:15 np0005591285 python3.9[139830]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:16 np0005591285 python3.9[139985]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:17 np0005591285 python3.9[140140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:18 np0005591285 python3.9[140295]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:19 np0005591285 python3.9[140450]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:20 np0005591285 python3.9[140605]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:21 np0005591285 python3.9[140760]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:22 np0005591285 python3.9[140915]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:23 np0005591285 python3.9[141070]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:24 np0005591285 python3.9[141225]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:25 np0005591285 python3.9[141380]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:25 np0005591285 python3.9[141535]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:26 np0005591285 python3.9[141690]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:27 np0005591285 python3.9[141845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 18:32:29 np0005591285 podman[141873]: 2026-01-21 23:32:29.284323042 +0000 UTC m=+0.143153761 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:32:30 np0005591285 python3.9[142026]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:32:30 np0005591285 python3.9[142178]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:32:31 np0005591285 python3.9[142330]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:32:32 np0005591285 python3.9[142482]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:32:33 np0005591285 python3.9[142634]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:32:33 np0005591285 python3.9[142786]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:32:34 np0005591285 python3.9[142936]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:32:35 np0005591285 python3.9[143088]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:36 np0005591285 podman[143185]: 2026-01-21 23:32:36.646069082 +0000 UTC m=+0.082383029 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 21 18:32:36 np0005591285 python3.9[143228]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038355.1939416-1648-107537201073086/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:37 np0005591285 python3.9[143384]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:38 np0005591285 python3.9[143509]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038357.0242817-1648-16096189356809/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:39 np0005591285 python3.9[143661]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:39 np0005591285 python3.9[143786]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038358.4799197-1648-2639124913127/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:40 np0005591285 python3.9[143938]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:41 np0005591285 python3.9[144063]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038359.846716-1648-232263717647446/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:41 np0005591285 python3.9[144215]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:42 np0005591285 python3.9[144340]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038361.2381947-1648-11574317998447/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:43 np0005591285 python3.9[144492]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:43 np0005591285 python3.9[144617]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038362.664837-1648-19690625854836/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:44 np0005591285 python3.9[144769]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:45 np0005591285 python3.9[144892]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038364.0232096-1648-234941499207495/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:45 np0005591285 python3.9[145044]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:46 np0005591285 python3.9[145169]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038365.380548-1648-176666276368753/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:47 np0005591285 python3.9[145321]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 21 18:32:48 np0005591285 python3.9[145474]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:48 np0005591285 python3.9[145626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:49 np0005591285 python3.9[145778]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:50 np0005591285 python3.9[145930]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:51 np0005591285 python3.9[146082]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:51 np0005591285 python3.9[146234]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:52 np0005591285 python3.9[146386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:53 np0005591285 python3.9[146538]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:54 np0005591285 python3.9[146690]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:55 np0005591285 python3.9[146842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:55 np0005591285 python3.9[146994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:56 np0005591285 python3.9[147146]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:57 np0005591285 python3.9[147298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:58 np0005591285 python3.9[147450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:32:59 np0005591285 python3.9[147602]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:32:59 np0005591285 podman[147697]: 2026-01-21 23:32:59.710209848 +0000 UTC m=+0.100219638 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:32:59 np0005591285 python3.9[147740]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038378.7250998-2312-154215553255274/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:00 np0005591285 python3.9[147902]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:01 np0005591285 python3.9[148025]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038380.0381322-2312-252467877342618/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:01 np0005591285 python3.9[148177]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:02 np0005591285 python3.9[148300]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038381.3253424-2312-187761664045319/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:33:02.932 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:33:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:33:02.935 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:33:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:33:02.935 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:33:03 np0005591285 python3.9[148452]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:03 np0005591285 python3.9[148575]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038382.7264678-2312-257835869700743/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:04 np0005591285 python3.9[148727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:05 np0005591285 python3.9[148850]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038384.2921803-2312-52778761991074/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:06 np0005591285 python3.9[149002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:06 np0005591285 python3.9[149125]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038385.657899-2312-228177510005889/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:07 np0005591285 podman[149249]: 2026-01-21 23:33:07.193208079 +0000 UTC m=+0.078100758 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 18:33:07 np0005591285 python3.9[149293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:07 np0005591285 python3.9[149419]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038386.890886-2312-213203004088545/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:08 np0005591285 python3.9[149571]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:09 np0005591285 python3.9[149694]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038388.137909-2312-203100001751652/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:09 np0005591285 python3.9[149846]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:10 np0005591285 python3.9[149969]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038389.4143436-2312-177688553605709/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:11 np0005591285 python3.9[150121]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:11 np0005591285 python3.9[150244]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038390.6679075-2312-226000691431077/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:12 np0005591285 python3.9[150396]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:13 np0005591285 python3.9[150519]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038392.0644052-2312-259143421837150/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:14 np0005591285 python3.9[150671]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:14 np0005591285 python3.9[150794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038393.464739-2312-32874474622540/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:15 np0005591285 python3.9[150946]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:15 np0005591285 python3.9[151069]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038394.8435686-2312-232623769379389/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:16 np0005591285 python3.9[151221]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:17 np0005591285 python3.9[151344]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038396.1728556-2312-144202439382790/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:18 np0005591285 python3.9[151494]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:33:19 np0005591285 python3.9[151649]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 21 18:33:21 np0005591285 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 21 18:33:21 np0005591285 python3.9[151805]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:22 np0005591285 python3.9[151957]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:23 np0005591285 python3.9[152109]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:23 np0005591285 python3.9[152261]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:24 np0005591285 python3.9[152413]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:25 np0005591285 python3.9[152565]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:26 np0005591285 python3.9[152717]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:26 np0005591285 python3.9[152869]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:27 np0005591285 python3.9[153021]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:28 np0005591285 python3.9[153173]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:29 np0005591285 python3.9[153326]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:33:29 np0005591285 systemd[1]: Reloading.
Jan 21 18:33:29 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:33:29 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:33:29 np0005591285 systemd[1]: Starting libvirt logging daemon socket...
Jan 21 18:33:29 np0005591285 systemd[1]: Listening on libvirt logging daemon socket.
Jan 21 18:33:29 np0005591285 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 21 18:33:29 np0005591285 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 21 18:33:29 np0005591285 systemd[1]: Starting libvirt logging daemon...
Jan 21 18:33:29 np0005591285 systemd[1]: Started libvirt logging daemon.
Jan 21 18:33:30 np0005591285 podman[153473]: 2026-01-21 23:33:30.271632327 +0000 UTC m=+0.123526833 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:33:30 np0005591285 python3.9[153540]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:33:30 np0005591285 systemd[1]: Reloading.
Jan 21 18:33:30 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:33:30 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:33:30 np0005591285 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 21 18:33:30 np0005591285 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 21 18:33:30 np0005591285 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 21 18:33:30 np0005591285 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 21 18:33:30 np0005591285 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 21 18:33:30 np0005591285 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 21 18:33:30 np0005591285 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 18:33:31 np0005591285 systemd[1]: Started libvirt nodedev daemon.
Jan 21 18:33:31 np0005591285 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 21 18:33:31 np0005591285 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 21 18:33:31 np0005591285 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 21 18:33:31 np0005591285 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 21 18:33:31 np0005591285 python3.9[153762]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:33:31 np0005591285 systemd[1]: Reloading.
Jan 21 18:33:31 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:33:31 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:33:32 np0005591285 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 21 18:33:32 np0005591285 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 21 18:33:32 np0005591285 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 21 18:33:32 np0005591285 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 21 18:33:32 np0005591285 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:33:32 np0005591285 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:33:32 np0005591285 setroubleshoot[153607]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 13c04650-a149-4e4a-a58c-5525dd424cbf
Jan 21 18:33:32 np0005591285 setroubleshoot[153607]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 21 18:33:32 np0005591285 setroubleshoot[153607]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 13c04650-a149-4e4a-a58c-5525dd424cbf
Jan 21 18:33:32 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:33:32 np0005591285 setroubleshoot[153607]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Jan 21 18:33:33 np0005591285 python3.9[153982]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:33:33 np0005591285 systemd[1]: Reloading.
Jan 21 18:33:33 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:33:33 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:33:33 np0005591285 systemd[1]: Listening on libvirt locking daemon socket.
Jan 21 18:33:33 np0005591285 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 21 18:33:33 np0005591285 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 21 18:33:33 np0005591285 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 21 18:33:33 np0005591285 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 21 18:33:33 np0005591285 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 21 18:33:33 np0005591285 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 21 18:33:33 np0005591285 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 21 18:33:33 np0005591285 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 21 18:33:33 np0005591285 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 21 18:33:33 np0005591285 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 18:33:33 np0005591285 systemd[1]: Started libvirt QEMU daemon.
Jan 21 18:33:34 np0005591285 python3.9[154197]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:33:34 np0005591285 systemd[1]: Reloading.
Jan 21 18:33:34 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:33:34 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:33:34 np0005591285 systemd[1]: Starting libvirt secret daemon socket...
Jan 21 18:33:34 np0005591285 systemd[1]: Listening on libvirt secret daemon socket.
Jan 21 18:33:34 np0005591285 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 21 18:33:34 np0005591285 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 21 18:33:34 np0005591285 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 21 18:33:34 np0005591285 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 21 18:33:34 np0005591285 systemd[1]: Starting libvirt secret daemon...
Jan 21 18:33:34 np0005591285 systemd[1]: Started libvirt secret daemon.
Jan 21 18:33:35 np0005591285 python3.9[154409]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:36 np0005591285 python3.9[154561]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:33:37 np0005591285 podman[154685]: 2026-01-21 23:33:37.767340719 +0000 UTC m=+0.084766731 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:33:37 np0005591285 python3.9[154730]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:38 np0005591285 python3.9[154856]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038417.3544085-3347-255074229235869/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:39 np0005591285 python3.9[155008]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:40 np0005591285 python3.9[155160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:40 np0005591285 python3.9[155238]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:41 np0005591285 python3.9[155390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:42 np0005591285 python3.9[155468]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xhdk9mkr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:42 np0005591285 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 21 18:33:42 np0005591285 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.008s CPU time.
Jan 21 18:33:42 np0005591285 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 21 18:33:42 np0005591285 python3.9[155620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:43 np0005591285 python3.9[155698]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:44 np0005591285 python3.9[155850]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:33:45 np0005591285 python3[156003]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 18:33:46 np0005591285 python3.9[156155]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:46 np0005591285 python3.9[156233]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:47 np0005591285 python3.9[156385]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:47 np0005591285 python3.9[156510]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038426.7842066-3614-139606160484440/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:48 np0005591285 python3.9[156662]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:49 np0005591285 python3.9[156740]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:50 np0005591285 python3.9[156892]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:50 np0005591285 python3.9[156970]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:51 np0005591285 python3.9[157122]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:52 np0005591285 python3.9[157247]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038430.991748-3731-280527265481181/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:53 np0005591285 python3.9[157399]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:53 np0005591285 python3.9[157551]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:33:54 np0005591285 python3.9[157706]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:55 np0005591285 python3.9[157858]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:33:56 np0005591285 python3.9[158011]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:33:57 np0005591285 python3.9[158165]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:33:58 np0005591285 python3.9[158320]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:33:59 np0005591285 python3.9[158472]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:33:59 np0005591285 python3.9[158595]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038438.6464214-3947-116939938789034/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:00 np0005591285 podman[158719]: 2026-01-21 23:34:00.562240382 +0000 UTC m=+0.132035568 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:34:00 np0005591285 python3.9[158764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:34:01 np0005591285 python3.9[158896]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038440.1194787-3992-82870542779666/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:02 np0005591285 python3.9[159048]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:34:02 np0005591285 python3.9[159171]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038441.5762613-4038-177994642784962/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:34:02.933 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:34:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:34:02.936 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:34:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:34:02.936 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:34:03 np0005591285 python3.9[159323]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:34:03 np0005591285 systemd[1]: Reloading.
Jan 21 18:34:03 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:34:03 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:34:04 np0005591285 systemd[1]: Reached target edpm_libvirt.target.
Jan 21 18:34:04 np0005591285 python3.9[159514]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 18:34:04 np0005591285 systemd[1]: Reloading.
Jan 21 18:34:05 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:34:05 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:34:05 np0005591285 systemd[1]: Reloading.
Jan 21 18:34:05 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:34:05 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:34:06 np0005591285 systemd[1]: session-23.scope: Deactivated successfully.
Jan 21 18:34:06 np0005591285 systemd[1]: session-23.scope: Consumed 3min 53.598s CPU time.
Jan 21 18:34:06 np0005591285 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Jan 21 18:34:06 np0005591285 systemd-logind[788]: Removed session 23.
Jan 21 18:34:08 np0005591285 podman[159613]: 2026-01-21 23:34:08.241826024 +0000 UTC m=+0.085776916 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 18:34:11 np0005591285 systemd-logind[788]: New session 24 of user zuul.
Jan 21 18:34:11 np0005591285 systemd[1]: Started Session 24 of User zuul.
Jan 21 18:34:13 np0005591285 python3.9[159786]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:34:14 np0005591285 python3.9[159940]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:34:14 np0005591285 network[159957]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:34:14 np0005591285 network[159958]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:34:14 np0005591285 network[159959]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:34:19 np0005591285 python3.9[160230]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 18:34:20 np0005591285 python3.9[160314]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:34:27 np0005591285 python3.9[160467]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:34:28 np0005591285 python3.9[160619]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:34:29 np0005591285 python3.9[160772]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:34:30 np0005591285 python3.9[160924]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:34:31 np0005591285 podman[161022]: 2026-01-21 23:34:31.302256708 +0000 UTC m=+0.143259907 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 21 18:34:31 np0005591285 python3.9[161102]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:34:32 np0005591285 python3.9[161225]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038470.9538224-248-79523999913834/.source.iscsi _original_basename=.797b_mg2 follow=False checksum=c3814a899fe3613189ae100f365f4170a222192a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:33 np0005591285 python3.9[161377]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:34 np0005591285 python3.9[161529]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:35 np0005591285 python3.9[161681]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:34:35 np0005591285 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 21 18:34:36 np0005591285 python3.9[161837]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:34:36 np0005591285 systemd[1]: Reloading.
Jan 21 18:34:36 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:34:36 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:34:36 np0005591285 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 18:34:36 np0005591285 systemd[1]: Starting Open-iSCSI...
Jan 21 18:34:36 np0005591285 kernel: Loading iSCSI transport class v2.0-870.
Jan 21 18:34:37 np0005591285 systemd[1]: Started Open-iSCSI.
Jan 21 18:34:37 np0005591285 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 21 18:34:37 np0005591285 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 21 18:34:38 np0005591285 python3.9[162037]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:34:38 np0005591285 network[162054]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:34:38 np0005591285 network[162055]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:34:38 np0005591285 network[162056]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:34:38 np0005591285 podman[162061]: 2026-01-21 23:34:38.424266891 +0000 UTC m=+0.078074894 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:34:44 np0005591285 python3.9[162346]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:34:47 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:34:47 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:34:47 np0005591285 systemd[1]: Reloading.
Jan 21 18:34:47 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:34:47 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:34:47 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:34:47 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:34:47 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:34:47 np0005591285 systemd[1]: run-r2e2db999ed47408ab59288c12201165a.service: Deactivated successfully.
Jan 21 18:34:48 np0005591285 python3.9[162663]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 18:34:49 np0005591285 python3.9[162815]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 21 18:34:50 np0005591285 python3.9[162971]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:34:51 np0005591285 python3.9[163094]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038490.1439464-511-761811646232/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:52 np0005591285 python3.9[163246]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:53 np0005591285 python3.9[163398]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:34:54 np0005591285 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 18:34:54 np0005591285 systemd[1]: Stopped Load Kernel Modules.
Jan 21 18:34:54 np0005591285 systemd[1]: Stopping Load Kernel Modules...
Jan 21 18:34:54 np0005591285 systemd[1]: Starting Load Kernel Modules...
Jan 21 18:34:54 np0005591285 systemd[1]: Finished Load Kernel Modules.
Jan 21 18:34:55 np0005591285 python3.9[163554]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:34:56 np0005591285 python3.9[163707]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:34:57 np0005591285 python3.9[163859]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:34:58 np0005591285 python3.9[163982]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038496.7570944-665-79050709376059/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:34:58 np0005591285 python3.9[164134]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:35:00 np0005591285 python3.9[164287]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:01 np0005591285 python3.9[164439]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:01 np0005591285 podman[164563]: 2026-01-21 23:35:01.726120599 +0000 UTC m=+0.136507569 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 18:35:01 np0005591285 python3.9[164607]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:02 np0005591285 python3.9[164768]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:35:02.934 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:35:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:35:02.936 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:35:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:35:02.936 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:35:03 np0005591285 python3.9[164920]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:04 np0005591285 python3.9[165072]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:05 np0005591285 python3.9[165224]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:06 np0005591285 python3.9[165376]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:35:06 np0005591285 python3.9[165530]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:35:07 np0005591285 python3.9[165683]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:08 np0005591285 systemd[1]: Listening on multipathd control socket.
Jan 21 18:35:10 np0005591285 podman[165811]: 2026-01-21 23:35:10.151825056 +0000 UTC m=+0.058758508 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:35:10 np0005591285 python3.9[165858]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:10 np0005591285 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 21 18:35:10 np0005591285 udevadm[165863]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 21 18:35:10 np0005591285 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 21 18:35:10 np0005591285 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 18:35:10 np0005591285 multipathd[165866]: --------start up--------
Jan 21 18:35:10 np0005591285 multipathd[165866]: read /etc/multipath.conf
Jan 21 18:35:10 np0005591285 multipathd[165866]: path checkers start up
Jan 21 18:35:10 np0005591285 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 18:35:12 np0005591285 python3.9[166025]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 18:35:12 np0005591285 python3.9[166177]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 21 18:35:12 np0005591285 kernel: Key type psk registered
Jan 21 18:35:13 np0005591285 python3.9[166339]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:35:14 np0005591285 python3.9[166462]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038513.2890224-1055-65078388931059/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:15 np0005591285 python3.9[166614]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:16 np0005591285 python3.9[166766]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:35:16 np0005591285 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 18:35:16 np0005591285 systemd[1]: Stopped Load Kernel Modules.
Jan 21 18:35:16 np0005591285 systemd[1]: Stopping Load Kernel Modules...
Jan 21 18:35:16 np0005591285 systemd[1]: Starting Load Kernel Modules...
Jan 21 18:35:16 np0005591285 systemd[1]: Finished Load Kernel Modules.
Jan 21 18:35:17 np0005591285 python3.9[166922]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 18:35:20 np0005591285 systemd[1]: Reloading.
Jan 21 18:35:20 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:35:20 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:35:20 np0005591285 systemd[1]: Reloading.
Jan 21 18:35:20 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:35:20 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:35:21 np0005591285 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 18:35:21 np0005591285 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 18:35:21 np0005591285 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 18:35:21 np0005591285 systemd[1]: Starting man-db-cache-update.service...
Jan 21 18:35:21 np0005591285 systemd[1]: Reloading.
Jan 21 18:35:21 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:35:21 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:35:21 np0005591285 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 18:35:22 np0005591285 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 18:35:22 np0005591285 systemd[1]: Finished man-db-cache-update.service.
Jan 21 18:35:22 np0005591285 systemd[1]: man-db-cache-update.service: Consumed 1.644s CPU time.
Jan 21 18:35:22 np0005591285 systemd[1]: run-r7a19c4bbd00e434380c48d36f5558b52.service: Deactivated successfully.
Jan 21 18:35:23 np0005591285 python3.9[168388]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:35:23 np0005591285 systemd[1]: Stopping Open-iSCSI...
Jan 21 18:35:23 np0005591285 iscsid[161878]: iscsid shutting down.
Jan 21 18:35:23 np0005591285 systemd[1]: iscsid.service: Deactivated successfully.
Jan 21 18:35:23 np0005591285 systemd[1]: Stopped Open-iSCSI.
Jan 21 18:35:23 np0005591285 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 18:35:23 np0005591285 systemd[1]: Starting Open-iSCSI...
Jan 21 18:35:23 np0005591285 systemd[1]: Started Open-iSCSI.
Jan 21 18:35:24 np0005591285 python3.9[168544]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:35:24 np0005591285 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 21 18:35:24 np0005591285 multipathd[165866]: exit (signal)
Jan 21 18:35:24 np0005591285 multipathd[165866]: --------shut down-------
Jan 21 18:35:24 np0005591285 systemd[1]: multipathd.service: Deactivated successfully.
Jan 21 18:35:24 np0005591285 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 21 18:35:24 np0005591285 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 18:35:24 np0005591285 multipathd[168550]: --------start up--------
Jan 21 18:35:24 np0005591285 multipathd[168550]: read /etc/multipath.conf
Jan 21 18:35:24 np0005591285 multipathd[168550]: path checkers start up
Jan 21 18:35:24 np0005591285 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 18:35:25 np0005591285 python3.9[168707]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:35:26 np0005591285 python3.9[168863]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:28 np0005591285 python3.9[169015]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:35:28 np0005591285 systemd[1]: Reloading.
Jan 21 18:35:28 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:35:28 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:35:29 np0005591285 python3.9[169200]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:35:29 np0005591285 network[169217]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:35:29 np0005591285 network[169218]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:35:29 np0005591285 network[169219]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:35:31 np0005591285 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 21 18:35:31 np0005591285 podman[169278]: 2026-01-21 23:35:31.976825747 +0000 UTC m=+0.159570535 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:35:32 np0005591285 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:35:33 np0005591285 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 21 18:35:34 np0005591285 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 21 18:35:38 np0005591285 python3.9[169520]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:39 np0005591285 python3.9[169673]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:40 np0005591285 python3.9[169826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:40 np0005591285 podman[169828]: 2026-01-21 23:35:40.51179606 +0000 UTC m=+0.103566547 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:35:41 np0005591285 python3.9[169998]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:42 np0005591285 python3.9[170151]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:43 np0005591285 python3.9[170304]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:44 np0005591285 python3.9[170457]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:45 np0005591285 python3.9[170610]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:35:46 np0005591285 python3.9[170763]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:47 np0005591285 python3.9[170915]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:48 np0005591285 python3.9[171067]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:50 np0005591285 python3.9[171219]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:51 np0005591285 python3.9[171371]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:51 np0005591285 python3.9[171523]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:52 np0005591285 python3.9[171675]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:53 np0005591285 python3.9[171827]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:54 np0005591285 python3.9[171979]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:55 np0005591285 python3.9[172131]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:56 np0005591285 python3.9[172283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:57 np0005591285 python3.9[172435]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:57 np0005591285 python3.9[172587]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:58 np0005591285 python3.9[172739]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:35:59 np0005591285 python3.9[172891]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:36:00 np0005591285 python3.9[173043]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:36:01 np0005591285 python3.9[173195]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:02 np0005591285 podman[173274]: 2026-01-21 23:36:02.349283508 +0000 UTC m=+0.188453433 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 18:36:02 np0005591285 python3.9[173374]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:36:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:36:02.936 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:36:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:36:02.939 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:36:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:36:02.939 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:36:04 np0005591285 python3.9[173526]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:36:04 np0005591285 systemd[1]: Reloading.
Jan 21 18:36:04 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:36:04 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:36:05 np0005591285 python3.9[173713]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:06 np0005591285 python3.9[173866]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:07 np0005591285 python3.9[174019]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:08 np0005591285 python3.9[174172]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:09 np0005591285 python3.9[174325]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:09 np0005591285 python3.9[174478]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:10 np0005591285 podman[174631]: 2026-01-21 23:36:10.67421459 +0000 UTC m=+0.093558589 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 18:36:10 np0005591285 python3.9[174632]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:11 np0005591285 python3.9[174805]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:36:13 np0005591285 python3.9[174958]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:14 np0005591285 python3.9[175110]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:15 np0005591285 python3.9[175262]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:16 np0005591285 python3.9[175414]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:16 np0005591285 python3.9[175566]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:17 np0005591285 python3.9[175718]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:18 np0005591285 python3.9[175870]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:19 np0005591285 python3.9[176022]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:19 np0005591285 python3.9[176174]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:20 np0005591285 python3.9[176326]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:26 np0005591285 python3.9[176478]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 21 18:36:27 np0005591285 python3.9[176631]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:36:28 np0005591285 python3.9[176789]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:36:29 np0005591285 systemd-logind[788]: New session 25 of user zuul.
Jan 21 18:36:29 np0005591285 systemd[1]: Started Session 25 of User zuul.
Jan 21 18:36:30 np0005591285 systemd[1]: session-25.scope: Deactivated successfully.
Jan 21 18:36:30 np0005591285 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Jan 21 18:36:30 np0005591285 systemd-logind[788]: Removed session 25.
Jan 21 18:36:31 np0005591285 python3.9[176975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:31 np0005591285 python3.9[177096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038590.3530455-2642-267797190935021/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:32 np0005591285 python3.9[177246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:32 np0005591285 podman[177247]: 2026-01-21 23:36:32.59751574 +0000 UTC m=+0.119023713 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:36:33 np0005591285 python3.9[177348]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:34 np0005591285 python3.9[177498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:34 np0005591285 python3.9[177619]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038593.4266937-2642-17251849676504/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:35 np0005591285 python3.9[177769]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:36 np0005591285 python3.9[177890]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038594.9254181-2642-59282053866110/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:36 np0005591285 python3.9[178040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:37 np0005591285 python3.9[178161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038596.3377192-2642-218800797914173/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:38 np0005591285 python3.9[178311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:39 np0005591285 python3.9[178432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038597.7679884-2642-53529557560364/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:40 np0005591285 python3.9[178584]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:36:40 np0005591285 podman[178736]: 2026-01-21 23:36:40.857548871 +0000 UTC m=+0.091098513 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:36:40 np0005591285 python3.9[178737]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:36:41 np0005591285 python3.9[178906]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:36:42 np0005591285 python3.9[179058]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:43 np0005591285 python3.9[179181]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769038602.1539843-2966-124931735689430/.source _original_basename=.x5n1t9wy follow=False checksum=cd05be4a17de1d82c10d29753ffdd409ac537a7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 21 18:36:44 np0005591285 python3.9[179333]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:36:45 np0005591285 python3.9[179485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:46 np0005591285 python3.9[179606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038604.9258528-3041-49403143223724/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:46 np0005591285 python3.9[179756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:36:47 np0005591285 python3.9[179877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038606.3491337-3086-27471826019808/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:36:48 np0005591285 python3.9[180029]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 21 18:36:49 np0005591285 python3.9[180181]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:36:51 np0005591285 python3[180333]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:36:51 np0005591285 podman[180365]: 2026-01-21 23:36:51.672560255 +0000 UTC m=+0.080350886 container create ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, container_name=nova_compute_init)
Jan 21 18:36:51 np0005591285 podman[180365]: 2026-01-21 23:36:51.632508282 +0000 UTC m=+0.040298963 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 18:36:51 np0005591285 python3[180333]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 21 18:36:52 np0005591285 python3.9[180555]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:36:54 np0005591285 python3.9[180709]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 21 18:36:55 np0005591285 python3.9[180861]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:36:56 np0005591285 python3[181013]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:36:56 np0005591285 podman[181050]: 2026-01-21 23:36:56.651471045 +0000 UTC m=+0.073889361 container create d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 18:36:56 np0005591285 podman[181050]: 2026-01-21 23:36:56.616826567 +0000 UTC m=+0.039244933 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 18:36:56 np0005591285 python3[181013]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 21 18:36:57 np0005591285 python3.9[181241]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:36:58 np0005591285 python3.9[181395]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:36:59 np0005591285 python3.9[181546]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038618.8188744-3374-240906860909192/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:00 np0005591285 python3.9[181622]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:37:00 np0005591285 systemd[1]: Reloading.
Jan 21 18:37:00 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:37:00 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:37:01 np0005591285 python3.9[181734]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:37:01 np0005591285 systemd[1]: Reloading.
Jan 21 18:37:01 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:37:01 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:37:01 np0005591285 systemd[1]: Starting nova_compute container...
Jan 21 18:37:01 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:37:01 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:01 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:01 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:01 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:01 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:01 np0005591285 podman[181773]: 2026-01-21 23:37:01.894299931 +0000 UTC m=+0.174322864 container init d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Jan 21 18:37:01 np0005591285 podman[181773]: 2026-01-21 23:37:01.907212877 +0000 UTC m=+0.187235760 container start d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:37:01 np0005591285 podman[181773]: nova_compute
Jan 21 18:37:01 np0005591285 nova_compute[181789]: + sudo -E kolla_set_configs
Jan 21 18:37:01 np0005591285 systemd[1]: Started nova_compute container.
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Validating config file
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying service configuration files
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Deleting /etc/ceph
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Creating directory /etc/ceph
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Writing out command to execute
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:02 np0005591285 nova_compute[181789]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:37:02 np0005591285 nova_compute[181789]: ++ cat /run_command
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + CMD=nova-compute
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + ARGS=
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + sudo kolla_copy_cacerts
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + [[ ! -n '' ]]
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + . kolla_extend_start
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 18:37:02 np0005591285 nova_compute[181789]: Running command: 'nova-compute'
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + umask 0022
Jan 21 18:37:02 np0005591285 nova_compute[181789]: + exec nova-compute
Jan 21 18:37:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:37:02.938 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:37:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:37:02.943 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:37:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:37:02.944 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:37:03 np0005591285 podman[181925]: 2026-01-21 23:37:03.282602503 +0000 UTC m=+0.176801451 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:37:03 np0005591285 python3.9[181963]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.154 181793 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.155 181793 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.155 181793 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.156 181793 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.307 181793 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.347 181793 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.348 181793 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 21 18:37:04 np0005591285 python3.9[182129]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:04 np0005591285 nova_compute[181789]: 2026-01-21 23:37:04.975 181793 INFO nova.virt.driver [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.127 181793 INFO nova.compute.provider_config [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.142 181793 DEBUG oslo_concurrency.lockutils [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.143 181793 DEBUG oslo_concurrency.lockutils [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.143 181793 DEBUG oslo_concurrency.lockutils [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.143 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.144 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.145 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.145 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.145 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.145 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.145 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.145 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.146 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.146 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.146 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.146 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.146 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.147 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.147 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.147 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.147 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.147 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.147 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.148 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.148 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.148 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.148 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.148 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.148 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.149 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.150 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.150 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.150 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.150 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.150 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.150 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.151 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.151 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.151 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.151 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.151 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.151 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.152 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.152 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.152 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.152 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.152 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.152 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.153 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.154 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.155 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.156 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.156 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.156 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.156 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.156 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.156 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.157 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.157 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.157 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.157 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.157 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.157 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.158 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.159 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.159 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.159 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.159 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.159 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.160 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.161 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.161 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.161 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.161 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.161 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.161 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.162 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.163 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.164 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.164 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.164 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.164 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.164 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.164 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.165 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.166 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.167 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.167 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.167 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.167 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.167 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.168 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.168 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.168 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.168 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.168 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.168 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.169 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.170 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.170 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.170 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.170 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.170 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.170 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.171 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.172 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.173 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.174 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.175 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.176 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.176 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.176 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.176 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.176 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.176 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.177 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.178 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.179 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.180 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.181 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.182 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.183 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.184 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.185 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.186 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.187 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.188 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.189 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.190 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.191 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.192 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.193 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.193 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.193 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.193 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.193 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.193 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.194 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.195 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.196 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.197 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.197 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.197 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.197 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.198 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.199 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.200 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.200 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.200 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.200 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.200 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.200 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.201 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.202 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.203 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.204 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.205 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.206 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.207 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.208 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.209 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.210 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.211 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.212 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.213 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.214 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.215 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.216 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.216 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.216 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.216 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.216 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.216 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.217 181793 WARNING oslo_config.cfg [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 18:37:05 np0005591285 nova_compute[181789]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 18:37:05 np0005591285 nova_compute[181789]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 18:37:05 np0005591285 nova_compute[181789]: and ``live_migration_inbound_addr`` respectively.
Jan 21 18:37:05 np0005591285 nova_compute[181789]: ).  Its value may be silently ignored in the future.#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.217 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.217 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.217 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.217 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.217 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.218 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.219 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.220 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.221 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.221 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.221 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.221 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.221 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.221 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.222 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.223 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.224 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.224 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.224 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.224 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.224 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.224 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.225 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.226 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.227 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.228 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.229 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.230 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.231 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.232 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.233 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.234 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.235 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.236 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.236 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.236 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.236 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.236 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.236 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.237 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.238 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.239 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.239 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.239 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.239 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.239 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.239 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.240 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.241 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.242 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.242 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.242 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.242 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.242 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.242 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.243 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.243 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.243 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.243 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.243 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.244 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.244 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.244 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.244 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.244 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.244 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.245 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.245 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.245 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.245 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.245 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.246 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.246 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.246 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.246 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.246 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.246 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.247 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.247 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.247 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.247 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.247 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.247 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.248 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.249 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.249 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.249 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.249 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.249 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.249 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.250 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.250 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.250 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.250 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.250 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.251 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.251 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.251 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.251 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.252 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.253 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.253 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.253 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.253 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.253 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.253 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.254 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.254 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.254 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.254 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.254 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.254 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.255 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.255 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.255 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.255 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.255 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.256 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.256 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.256 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.256 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.256 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.257 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.257 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.257 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.257 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.257 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.257 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.258 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.259 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.260 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.261 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.261 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.261 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.261 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.261 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.261 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.262 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.262 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.262 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.262 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.262 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.263 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.263 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.263 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.263 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.263 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.264 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.265 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.265 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.265 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.265 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.265 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.266 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.266 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.266 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.266 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.266 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.267 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.268 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.268 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.268 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.268 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.268 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.268 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.269 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.269 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.269 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.269 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.269 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.269 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.270 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.270 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.270 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.270 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.270 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.271 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.271 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.271 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.271 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.271 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.272 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.272 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.272 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.272 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.272 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.272 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.273 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.274 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.275 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.276 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.276 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.276 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.276 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.276 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.276 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.277 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.277 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.277 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.277 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.277 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.277 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.278 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.278 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.278 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.278 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.278 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.279 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.279 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.279 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.279 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.279 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.279 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.280 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.281 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.281 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.281 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.281 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.281 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.282 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.282 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.282 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.282 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.282 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.282 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.283 181793 DEBUG oslo_service.service [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.285 181793 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.335 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.336 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.337 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.337 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 21 18:37:05 np0005591285 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 18:37:05 np0005591285 systemd[1]: Started libvirt QEMU daemon.
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.446 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7ff2a0e035b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.450 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7ff2a0e035b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.451 181793 INFO nova.virt.libvirt.driver [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.468 181793 WARNING nova.virt.libvirt.driver [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 21 18:37:05 np0005591285 nova_compute[181789]: 2026-01-21 23:37:05.469 181793 DEBUG nova.virt.libvirt.volume.mount [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 21 18:37:05 np0005591285 python3.9[182303]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.523 181793 INFO nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <host>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <uuid>632224e8-817e-4a21-8112-83934a2544f5</uuid>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <arch>x86_64</arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model>EPYC-Rome-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <vendor>AMD</vendor>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <microcode version='16777317'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <signature family='23' model='49' stepping='0'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='x2apic'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='tsc-deadline'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='osxsave'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='hypervisor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='tsc_adjust'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='spec-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='stibp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='arch-capabilities'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='cmp_legacy'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='topoext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='virt-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='lbrv'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='tsc-scale'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='vmcb-clean'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='pause-filter'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='pfthreshold'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='svme-addr-chk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='rdctl-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='skip-l1dfl-vmentry'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='mds-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature name='pschange-mc-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <pages unit='KiB' size='4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <pages unit='KiB' size='2048'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <pages unit='KiB' size='1048576'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <power_management>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <suspend_mem/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <suspend_disk/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <suspend_hybrid/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </power_management>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <iommu support='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <migration_features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <live/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <uri_transports>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <uri_transport>tcp</uri_transport>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <uri_transport>rdma</uri_transport>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </uri_transports>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </migration_features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <topology>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <cells num='1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <cell id='0'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          <memory unit='KiB'>7864304</memory>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          <pages unit='KiB' size='4'>1966076</pages>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          <pages unit='KiB' size='2048'>0</pages>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          <distances>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <sibling id='0' value='10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          </distances>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          <cpus num='8'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:          </cpus>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        </cell>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </cells>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </topology>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <cache>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </cache>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <secmodel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model>selinux</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <doi>0</doi>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </secmodel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <secmodel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model>dac</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <doi>0</doi>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </secmodel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </host>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <guest>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <os_type>hvm</os_type>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <arch name='i686'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <wordsize>32</wordsize>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <domain type='qemu'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <domain type='kvm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <pae/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <nonpae/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <acpi default='on' toggle='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <apic default='on' toggle='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <cpuselection/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <deviceboot/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <disksnapshot default='on' toggle='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <externalSnapshot/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </guest>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <guest>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <os_type>hvm</os_type>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <arch name='x86_64'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <wordsize>64</wordsize>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <domain type='qemu'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <domain type='kvm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <acpi default='on' toggle='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <apic default='on' toggle='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <cpuselection/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <deviceboot/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <disksnapshot default='on' toggle='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <externalSnapshot/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </guest>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 
Jan 21 18:37:06 np0005591285 nova_compute[181789]: </capabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.535 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.560 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 18:37:06 np0005591285 nova_compute[181789]: <domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <domain>kvm</domain>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <arch>i686</arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <vcpu max='240'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <iothreads supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <os supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='firmware'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <loader supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>rom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pflash</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='readonly'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>yes</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='secure'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </loader>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </os>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='maximumMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <vendor>AMD</vendor>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='succor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='custom' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <memoryBacking supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='sourceType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>anonymous</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>memfd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </memoryBacking>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <disk supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='diskDevice'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>disk</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cdrom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>floppy</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>lun</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ide</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>fdc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>sata</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </disk>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <graphics supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vnc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egl-headless</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </graphics>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <video supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='modelType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vga</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cirrus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>none</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>bochs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ramfb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </video>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hostdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='mode'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>subsystem</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='startupPolicy'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>mandatory</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>requisite</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>optional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='subsysType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pci</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='capsType'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='pciBackend'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hostdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <rng supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>random</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </rng>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <filesystem supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='driverType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>path</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>handle</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtiofs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </filesystem>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tpm supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-tis</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-crb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emulator</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>external</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendVersion'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>2.0</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </tpm>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <redirdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </redirdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <channel supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </channel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <crypto supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </crypto>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <interface supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>passt</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </interface>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <panic supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>isa</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>hyperv</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </panic>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <console supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>null</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dev</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pipe</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stdio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>udp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tcp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu-vdagent</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </console>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <gic supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <genid supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backup supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <async-teardown supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <s390-pv supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <ps2 supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tdx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sev supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sgx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hyperv supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='features'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>relaxed</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vapic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>spinlocks</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vpindex</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>runtime</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>synic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stimer</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reset</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vendor_id</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>frequencies</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reenlightenment</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tlbflush</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ipi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>avic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emsr_bitmap</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>xmm_input</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hyperv>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <launchSecurity supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: </domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.570 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 18:37:06 np0005591285 nova_compute[181789]: <domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <domain>kvm</domain>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <arch>i686</arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <vcpu max='4096'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <iothreads supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <os supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='firmware'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <loader supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>rom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pflash</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='readonly'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>yes</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='secure'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </loader>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </os>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='maximumMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <vendor>AMD</vendor>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='succor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='custom' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <memoryBacking supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='sourceType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>anonymous</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>memfd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </memoryBacking>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <disk supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='diskDevice'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>disk</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cdrom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>floppy</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>lun</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>fdc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>sata</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </disk>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <graphics supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vnc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egl-headless</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </graphics>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <video supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='modelType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vga</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cirrus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>none</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>bochs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ramfb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </video>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hostdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='mode'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>subsystem</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='startupPolicy'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>mandatory</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>requisite</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>optional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='subsysType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pci</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='capsType'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='pciBackend'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hostdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <rng supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>random</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </rng>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <filesystem supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='driverType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>path</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>handle</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtiofs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </filesystem>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tpm supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-tis</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-crb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emulator</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>external</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendVersion'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>2.0</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </tpm>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <redirdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </redirdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <channel supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </channel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <crypto supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </crypto>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <interface supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>passt</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </interface>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <panic supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>isa</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>hyperv</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </panic>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <console supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>null</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dev</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pipe</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stdio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>udp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tcp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu-vdagent</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </console>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <gic supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <genid supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backup supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <async-teardown supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <s390-pv supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <ps2 supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tdx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sev supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sgx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hyperv supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='features'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>relaxed</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vapic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>spinlocks</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vpindex</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>runtime</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>synic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stimer</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reset</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vendor_id</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>frequencies</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reenlightenment</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tlbflush</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ipi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>avic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emsr_bitmap</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>xmm_input</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hyperv>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <launchSecurity supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: </domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.658 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.665 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 18:37:06 np0005591285 nova_compute[181789]: <domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <domain>kvm</domain>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <arch>x86_64</arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <vcpu max='240'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <iothreads supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <os supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='firmware'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <loader supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>rom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pflash</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='readonly'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>yes</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='secure'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </loader>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </os>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='maximumMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <vendor>AMD</vendor>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='succor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='custom' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 python3.9[182495]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <memoryBacking supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='sourceType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>anonymous</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>memfd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </memoryBacking>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <disk supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='diskDevice'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>disk</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cdrom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>floppy</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>lun</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ide</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>fdc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>sata</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </disk>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <graphics supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vnc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egl-headless</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </graphics>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <video supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='modelType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vga</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cirrus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>none</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>bochs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ramfb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </video>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hostdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='mode'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>subsystem</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='startupPolicy'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>mandatory</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>requisite</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>optional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='subsysType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pci</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='capsType'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='pciBackend'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hostdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <rng supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>random</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </rng>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <filesystem supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='driverType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>path</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>handle</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtiofs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </filesystem>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tpm supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-tis</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-crb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emulator</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>external</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendVersion'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>2.0</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </tpm>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <redirdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </redirdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <channel supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </channel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <crypto supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </crypto>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <interface supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>passt</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </interface>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <panic supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>isa</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>hyperv</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </panic>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <console supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>null</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dev</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pipe</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stdio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>udp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tcp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu-vdagent</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </console>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <gic supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <genid supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backup supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <async-teardown supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <s390-pv supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <ps2 supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tdx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sev supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sgx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hyperv supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='features'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>relaxed</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vapic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>spinlocks</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vpindex</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>runtime</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>synic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stimer</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reset</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vendor_id</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>frequencies</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reenlightenment</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tlbflush</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ipi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>avic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emsr_bitmap</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>xmm_input</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hyperv>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <launchSecurity supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: </domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.747 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 18:37:06 np0005591285 nova_compute[181789]: <domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <domain>kvm</domain>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <arch>x86_64</arch>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <vcpu max='4096'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <iothreads supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <os supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='firmware'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>efi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <loader supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>rom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pflash</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='readonly'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>yes</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='secure'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>yes</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>no</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </loader>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </os>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='maximumMigratable'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>on</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>off</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <vendor>AMD</vendor>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='succor'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <mode name='custom' supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ddpd-u'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sha512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm3'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sm4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Denverton-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amd-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='auto-ibrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='perfmon-v2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbpb'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='stibp-always-on'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='EPYC-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-128'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-256'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx10-512'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='prefetchiti'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Haswell-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512er'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512pf'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fma4'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tbm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xop'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='amx-tile'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-bf16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-fp16'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bitalg'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrc'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fzrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='la57'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='taa-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ifma'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cmpccxadd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fbsdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='fsrs'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ibrs-all'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='intel-psfd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='lam'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mcdt-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pbrsb-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='psdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='serialize'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vaes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='hle'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='rtm'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512bw'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512cd'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512dq'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512f'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='avx512vl'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='invpcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pcid'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='pku'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='mpx'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='core-capability'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='split-lock-detect'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='cldemote'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='erms'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='gfni'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdir64b'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='movdiri'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='xsaves'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='athlon-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='core2duo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='coreduo-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='n270-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='ss'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <blockers model='phenom-v1'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnow'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <feature name='3dnowext'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </blockers>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </mode>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <memoryBacking supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <enum name='sourceType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>anonymous</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <value>memfd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </memoryBacking>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <disk supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='diskDevice'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>disk</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cdrom</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>floppy</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>lun</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>fdc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>sata</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </disk>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <graphics supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vnc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egl-headless</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </graphics>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <video supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='modelType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vga</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>cirrus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>none</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>bochs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ramfb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </video>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hostdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='mode'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>subsystem</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='startupPolicy'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>mandatory</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>requisite</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>optional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='subsysType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pci</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>scsi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='capsType'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='pciBackend'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hostdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <rng supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtio-non-transitional</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>random</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>egd</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </rng>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <filesystem supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='driverType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>path</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>handle</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>virtiofs</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </filesystem>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tpm supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-tis</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tpm-crb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emulator</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>external</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendVersion'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>2.0</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </tpm>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <redirdev supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='bus'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>usb</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </redirdev>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <channel supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </channel>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <crypto supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendModel'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>builtin</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </crypto>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <interface supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='backendType'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>default</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>passt</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </interface>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <panic supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='model'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>isa</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>hyperv</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </panic>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <console supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='type'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>null</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vc</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pty</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dev</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>file</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>pipe</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stdio</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>udp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tcp</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>unix</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>qemu-vdagent</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>dbus</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </console>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </devices>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <gic supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <genid supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <backup supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <async-teardown supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <s390-pv supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <ps2 supported='yes'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <tdx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sev supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <sgx supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <hyperv supported='yes'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <enum name='features'>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>relaxed</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vapic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>spinlocks</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vpindex</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>runtime</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>synic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>stimer</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reset</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>vendor_id</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>frequencies</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>reenlightenment</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>tlbflush</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>ipi</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>avic</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>emsr_bitmap</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <value>xmm_input</value>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </enum>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      <defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:      </defaults>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    </hyperv>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:    <launchSecurity supported='no'/>
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  </features>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: </domainCapabilities>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.844 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.845 181793 DEBUG nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.851 181793 INFO nova.virt.libvirt.host [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Secure Boot support detected
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.854 181793 INFO nova.virt.libvirt.driver [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.867 181793 DEBUG nova.virt.libvirt.driver [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 18:37:06 np0005591285 nova_compute[181789]:  <model>Nehalem</model>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: </cpu>
Jan 21 18:37:06 np0005591285 nova_compute[181789]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.871 181793 DEBUG nova.virt.libvirt.driver [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.909 181793 INFO nova.virt.node [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Determined node identity e96a8776-a298-4c19-937a-402cb8191067 from /var/lib/nova/compute_id#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.928 181793 WARNING nova.compute.manager [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Compute nodes ['e96a8776-a298-4c19-937a-402cb8191067'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 21 18:37:06 np0005591285 nova_compute[181789]: 2026-01-21 23:37:06.956 181793 INFO nova.compute.manager [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.021 181793 WARNING nova.compute.manager [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.021 181793 DEBUG oslo_concurrency.lockutils [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.022 181793 DEBUG oslo_concurrency.lockutils [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.022 181793 DEBUG oslo_concurrency.lockutils [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.022 181793 DEBUG nova.compute.resource_tracker [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:37:07 np0005591285 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 18:37:07 np0005591285 systemd[1]: Started libvirt nodedev daemon.
Jan 21 18:37:07 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.433 181793 WARNING nova.virt.libvirt.driver [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.434 181793 DEBUG nova.compute.resource_tracker [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6206MB free_disk=73.58760833740234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.434 181793 DEBUG oslo_concurrency.lockutils [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.435 181793 DEBUG oslo_concurrency.lockutils [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.486 181793 WARNING nova.compute.resource_tracker [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] No compute node record for compute-2.ctlplane.example.com:e96a8776-a298-4c19-937a-402cb8191067: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host e96a8776-a298-4c19-937a-402cb8191067 could not be found.#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.506 181793 INFO nova.compute.resource_tracker [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: e96a8776-a298-4c19-937a-402cb8191067#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.576 181793 DEBUG nova.compute.resource_tracker [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:37:07 np0005591285 nova_compute[181789]: 2026-01-21 23:37:07.577 181793 DEBUG nova.compute.resource_tracker [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:37:07 np0005591285 python3.9[182692]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 18:37:07 np0005591285 systemd[1]: Stopping nova_compute container...
Jan 21 18:37:08 np0005591285 nova_compute[181789]: 2026-01-21 23:37:08.074 181793 DEBUG oslo_concurrency.lockutils [None req-006d1d44-8b50-4352-9f60-dc35856b533c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:37:08 np0005591285 nova_compute[181789]: 2026-01-21 23:37:08.075 181793 DEBUG oslo_concurrency.lockutils [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:37:08 np0005591285 nova_compute[181789]: 2026-01-21 23:37:08.075 181793 DEBUG oslo_concurrency.lockutils [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:37:08 np0005591285 nova_compute[181789]: 2026-01-21 23:37:08.076 181793 DEBUG oslo_concurrency.lockutils [None req-6c5ad869-9008-4282-b2fa-81b360176ee2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:37:08 np0005591285 virtqemud[182299]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 21 18:37:08 np0005591285 virtqemud[182299]: hostname: compute-2
Jan 21 18:37:08 np0005591285 virtqemud[182299]: End of file while reading data: Input/output error
Jan 21 18:37:08 np0005591285 systemd[1]: libpod-d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5.scope: Deactivated successfully.
Jan 21 18:37:08 np0005591285 systemd[1]: libpod-d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5.scope: Consumed 3.449s CPU time.
Jan 21 18:37:08 np0005591285 conmon[181789]: conmon d3ea99c0c96ad7683553 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5.scope/container/memory.events
Jan 21 18:37:08 np0005591285 podman[182696]: 2026-01-21 23:37:08.473892507 +0000 UTC m=+0.476224414 container died d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3)
Jan 21 18:37:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5-userdata-shm.mount: Deactivated successfully.
Jan 21 18:37:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay-4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d-merged.mount: Deactivated successfully.
Jan 21 18:37:08 np0005591285 podman[182696]: 2026-01-21 23:37:08.546756774 +0000 UTC m=+0.549088681 container cleanup d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:37:08 np0005591285 podman[182696]: nova_compute
Jan 21 18:37:08 np0005591285 podman[182726]: nova_compute
Jan 21 18:37:08 np0005591285 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 21 18:37:08 np0005591285 systemd[1]: Stopped nova_compute container.
Jan 21 18:37:08 np0005591285 systemd[1]: Starting nova_compute container...
Jan 21 18:37:08 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:37:08 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:08 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:08 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:08 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:08 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4506c398dafb27a1b5aeb09408aea108f00512fd629ef5ec06310add9488884d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:08 np0005591285 podman[182740]: 2026-01-21 23:37:08.803268116 +0000 UTC m=+0.116490333 container init d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:37:08 np0005591285 podman[182740]: 2026-01-21 23:37:08.815289781 +0000 UTC m=+0.128511918 container start d3ea99c0c96ad7683553c5f41c4621c0e9f0e623bee3f4a7c2b90f7f436586f5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:37:08 np0005591285 podman[182740]: nova_compute
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + sudo -E kolla_set_configs
Jan 21 18:37:08 np0005591285 systemd[1]: Started nova_compute container.
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Validating config file
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying service configuration files
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /etc/ceph
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Creating directory /etc/ceph
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Writing out command to execute
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:08 np0005591285 nova_compute[182755]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 18:37:08 np0005591285 nova_compute[182755]: ++ cat /run_command
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + CMD=nova-compute
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + ARGS=
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + sudo kolla_copy_cacerts
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + [[ ! -n '' ]]
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + . kolla_extend_start
Jan 21 18:37:08 np0005591285 nova_compute[182755]: Running command: 'nova-compute'
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + umask 0022
Jan 21 18:37:08 np0005591285 nova_compute[182755]: + exec nova-compute
Jan 21 18:37:10 np0005591285 python3.9[182919]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 18:37:10 np0005591285 systemd[1]: Started libpod-conmon-ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce.scope.
Jan 21 18:37:10 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:37:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d22c73ceaa3b70b19788e2138665d192566b0ff093b1318cc02ff91776224cd6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d22c73ceaa3b70b19788e2138665d192566b0ff093b1318cc02ff91776224cd6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d22c73ceaa3b70b19788e2138665d192566b0ff093b1318cc02ff91776224cd6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 18:37:10 np0005591285 podman[182944]: 2026-01-21 23:37:10.798714814 +0000 UTC m=+0.168340799 container init ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:37:10 np0005591285 podman[182944]: 2026-01-21 23:37:10.806605494 +0000 UTC m=+0.176231459 container start ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm)
Jan 21 18:37:10 np0005591285 python3.9[182919]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Applying nova statedir ownership
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 21 18:37:10 np0005591285 nova_compute_init[182965]: INFO:nova_statedir:Nova statedir ownership complete
Jan 21 18:37:10 np0005591285 systemd[1]: libpod-ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce.scope: Deactivated successfully.
Jan 21 18:37:10 np0005591285 podman[182981]: 2026-01-21 23:37:10.958342701 +0000 UTC m=+0.041674607 container died ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:37:10 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce-userdata-shm.mount: Deactivated successfully.
Jan 21 18:37:10 np0005591285 systemd[1]: var-lib-containers-storage-overlay-d22c73ceaa3b70b19788e2138665d192566b0ff093b1318cc02ff91776224cd6-merged.mount: Deactivated successfully.
Jan 21 18:37:10 np0005591285 podman[182981]: 2026-01-21 23:37:10.99734721 +0000 UTC m=+0.080679086 container cleanup ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 21 18:37:11 np0005591285 systemd[1]: libpod-conmon-ff23a0c19e45d607cb33f10af7fd67011d76103914063646ee2fad058e8582ce.scope: Deactivated successfully.
Jan 21 18:37:11 np0005591285 podman[182982]: 2026-01-21 23:37:11.022760734 +0000 UTC m=+0.096642391 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.037 182759 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.039 182759 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.039 182759 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.039 182759 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.183 182759 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.207 182759 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.208 182759 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.712 182759 INFO nova.virt.driver [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 21 18:37:11 np0005591285 systemd[1]: session-24.scope: Deactivated successfully.
Jan 21 18:37:11 np0005591285 systemd[1]: session-24.scope: Consumed 1min 56.661s CPU time.
Jan 21 18:37:11 np0005591285 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Jan 21 18:37:11 np0005591285 systemd-logind[788]: Removed session 24.
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.847 182759 INFO nova.compute.provider_config [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.868 182759 DEBUG oslo_concurrency.lockutils [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.868 182759 DEBUG oslo_concurrency.lockutils [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.869 182759 DEBUG oslo_concurrency.lockutils [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.869 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.869 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.869 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.870 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.870 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.870 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.870 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.870 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.870 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.871 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.871 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.871 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.871 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.871 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.872 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.872 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.872 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.872 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.872 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.873 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.873 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.873 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.873 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.873 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.873 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.874 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.874 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.874 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.874 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.874 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.875 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.875 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.875 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.875 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.875 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.875 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.876 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.876 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.876 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.876 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.876 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.877 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.877 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.877 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.877 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.877 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.878 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.878 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.878 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.878 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.878 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.879 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.879 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.879 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.879 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.879 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.880 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.880 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.880 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.880 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.880 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.880 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.881 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.881 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.881 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.881 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.881 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.881 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.882 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.882 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.882 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.882 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.882 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.883 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.883 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.883 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.883 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.883 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.883 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.884 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.884 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.884 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.884 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.884 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.885 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.885 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.885 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.885 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.885 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.886 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.886 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.886 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.886 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.886 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.886 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.887 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.887 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.887 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.887 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.887 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.887 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.888 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.888 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.888 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.888 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.888 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.889 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.889 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.889 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.889 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.889 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.889 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.890 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.890 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.890 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.890 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.890 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.891 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.891 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.891 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.891 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.891 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.891 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.892 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.892 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.892 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.892 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.892 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.893 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.893 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.893 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.893 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.893 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.893 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.894 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.894 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.894 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.894 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.894 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.895 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.895 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.895 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.895 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.895 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.895 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.896 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.896 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.896 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.896 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.896 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.897 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.897 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.897 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.897 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.897 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.898 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.898 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.898 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.898 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.898 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.898 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.899 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.899 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.899 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.899 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.899 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.900 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.900 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.900 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.900 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.900 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.900 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.901 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.901 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.901 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.901 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.901 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.902 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.902 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.902 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.902 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.902 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.903 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.903 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.903 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.903 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.903 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.903 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.904 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.904 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.904 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.904 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.904 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.905 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.905 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.905 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.905 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.905 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.906 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.906 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.906 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.906 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.906 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.906 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.907 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.907 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.907 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.907 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.907 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.907 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.908 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.908 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.908 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.908 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.908 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.909 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.909 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.909 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.909 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.909 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.910 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.910 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.910 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.910 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.910 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.910 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.911 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.911 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.911 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.911 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.911 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.912 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.912 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.912 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.912 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.912 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.912 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.913 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.913 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.913 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.913 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.913 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.914 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.914 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.914 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.914 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.914 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.915 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.915 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.915 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.915 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.915 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.915 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.916 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.916 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.916 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.916 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.916 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.917 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.917 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.917 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.917 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.917 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.917 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.918 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.918 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.918 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.918 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.918 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.919 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.919 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.919 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.919 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.919 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.919 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.920 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.920 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.920 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.920 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.920 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.921 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.921 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.921 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.921 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.921 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.922 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.922 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.922 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.922 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.922 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.922 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.923 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.923 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.923 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.923 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.923 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.924 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.924 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.924 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.924 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.924 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.924 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.925 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.925 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.925 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.925 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.925 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.926 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.926 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.926 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.926 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.926 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.927 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.927 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.927 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.927 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.927 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.927 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.928 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.928 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.928 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.928 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.928 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.929 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.929 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.929 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.929 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.929 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.929 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.930 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.930 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.930 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.930 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.930 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.931 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.931 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.931 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.931 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.931 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.931 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.932 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.932 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.932 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.932 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.932 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.933 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.933 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.933 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.933 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.933 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.933 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.934 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.934 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.934 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.934 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.935 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.935 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.935 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.935 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.935 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.935 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.936 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.936 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.936 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.936 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.936 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.937 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.937 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.937 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.937 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.937 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.937 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.938 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.938 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.938 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.938 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.938 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.939 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.939 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.939 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.939 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.939 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.939 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.940 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.940 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.940 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.940 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.940 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.941 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.941 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.941 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.941 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.941 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.942 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.942 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.942 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.942 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.942 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.942 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.943 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.943 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.943 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.943 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.943 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.944 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.944 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.944 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.944 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.944 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.944 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.945 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.945 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.945 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.945 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.945 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.946 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
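The `[barbican]` and `[barbican_service_user]` values dumped above appear to be oslo defaults apart from `barbican_region_name`. A minimal nova.conf fragment reproducing the logged state might look like this (a sketch reconstructed from the log, not the node's actual file; commented lines restate defaults for context):

```ini
[barbican]
# Only value above that differs from the library defaults:
barbican_region_name = regionOne
# The remaining logged values match oslo defaults, e.g.:
# number_of_retries = 60
# retry_delay = 1
# verify_ssl = True

[barbican_service_user]
# All logged values are unset/defaults; send_service_user_token = False
# in [barbican] means this section is effectively unused here.
```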
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.946 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.946 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.946 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.946 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.946 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.947 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.947 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.947 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.947 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.947 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.947 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.948 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.948 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.948 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.948 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.948 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
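The `[vault]` section logged above likewise shows stock defaults (local dev-style endpoint, TLS off, KV v2 at the `secret` mount), which suggests the Vault key-manager backend is not actively configured on this node. Expressed as a nova.conf fragment (illustrative reconstruction from the log):

```ini
[vault]
vault_url = http://127.0.0.1:8200   # default; no remote Vault configured
use_ssl = False
kv_mountpoint = secret
kv_version = 2
# approle_role_id / approle_secret_id / root_token_id all unset (None)
```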
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.949 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.949 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.949 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.949 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.949 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.949 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.950 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.950 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.950 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.950 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.950 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.951 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.951 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.951 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.951 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.951 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.951 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.952 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.952 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
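The `[keystone]` options above are the keystoneauth adapter settings nova uses when talking to the identity service; only `service_type` and `valid_interfaces` carry non-None values. As a nova.conf fragment (a sketch derived from the logged values):

```ini
[keystone]
service_type = identity
valid_interfaces = internal,public
# region_name, endpoint_override, timeout, retry options: unset (None),
# so keystoneauth falls back to catalog lookup with library defaults.
```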
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.952 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.952 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.952 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.953 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.953 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.953 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.953 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.953 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.954 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.954 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.954 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.954 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.954 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.954 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.955 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.955 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.955 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.955 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.955 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.956 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.956 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.956 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.956 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.956 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.957 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.957 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.957 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.957 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.957 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.957 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.958 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.958 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.958 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.958 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.958 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.959 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.959 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.959 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.959 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.959 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.960 182759 WARNING oslo_config.cfg [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 18:37:11 np0005591285 nova_compute[182755]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 18:37:11 np0005591285 nova_compute[182755]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 18:37:11 np0005591285 nova_compute[182755]: and ``live_migration_inbound_addr`` respectively.
Jan 21 18:37:11 np0005591285 nova_compute[182755]: ).  Its value may be silently ignored in the future.#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.960 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.960 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.960 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.960 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.961 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.961 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.961 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.961 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.961 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.962 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.962 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.962 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.962 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.962 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.963 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.963 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.963 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.963 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.963 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.963 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.964 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.964 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.964 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.964 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.964 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.965 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.965 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.965 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.965 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.966 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.966 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.966 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.966 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.966 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.967 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.967 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.967 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.967 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.967 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.968 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.968 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.968 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.968 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.968 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.969 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.969 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.969 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.969 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.969 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.970 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.970 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.970 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.970 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.970 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.971 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.971 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.971 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.971 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.971 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.972 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.972 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.972 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.972 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.972 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.973 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.973 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.973 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.973 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.974 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.974 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.974 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.975 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.975 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.975 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.976 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.976 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.976 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.977 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.977 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.977 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.978 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.978 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.978 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.978 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.978 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.979 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.980 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.981 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.982 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.983 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.984 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.985 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.986 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.987 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.988 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.989 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.990 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.991 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.992 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.993 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.994 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.995 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.996 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.997 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.998 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:11 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:11.999 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.000 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.001 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.001 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.001 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.001 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.001 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.001 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.002 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.003 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.004 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.005 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.006 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.007 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.008 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.009 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.010 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.011 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.012 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.013 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.014 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.015 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.016 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.017 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.018 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.019 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.020 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.021 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.022 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.023 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.024 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.024 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.024 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.024 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.024 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.024 182759 DEBUG oslo_service.service [None req-03770a86-b916-455d-a5be-04baff7a18e9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.025 182759 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.049 182759 INFO nova.virt.node [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Determined node identity e96a8776-a298-4c19-937a-402cb8191067 from /var/lib/nova/compute_id#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.049 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.050 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.050 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.050 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.062 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2477d19040> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.065 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2477d19040> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.066 182759 INFO nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.071 182759 INFO nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <host>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <uuid>632224e8-817e-4a21-8112-83934a2544f5</uuid>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <arch>x86_64</arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model>EPYC-Rome-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <vendor>AMD</vendor>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <microcode version='16777317'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <signature family='23' model='49' stepping='0'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='x2apic'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='tsc-deadline'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='osxsave'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='hypervisor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='tsc_adjust'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='spec-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='stibp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='arch-capabilities'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='cmp_legacy'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='topoext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='virt-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='lbrv'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='tsc-scale'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='vmcb-clean'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='pause-filter'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='pfthreshold'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='svme-addr-chk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='rdctl-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='skip-l1dfl-vmentry'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='mds-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature name='pschange-mc-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <pages unit='KiB' size='4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <pages unit='KiB' size='2048'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <pages unit='KiB' size='1048576'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <power_management>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <suspend_mem/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <suspend_disk/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <suspend_hybrid/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </power_management>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <iommu support='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <migration_features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <live/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <uri_transports>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <uri_transport>tcp</uri_transport>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <uri_transport>rdma</uri_transport>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </uri_transports>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </migration_features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <topology>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <cells num='1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <cell id='0'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          <memory unit='KiB'>7864304</memory>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          <pages unit='KiB' size='4'>1966076</pages>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          <pages unit='KiB' size='2048'>0</pages>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          <distances>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <sibling id='0' value='10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          </distances>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          <cpus num='8'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:          </cpus>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        </cell>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </cells>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </topology>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <cache>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </cache>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <secmodel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model>selinux</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <doi>0</doi>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </secmodel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <secmodel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model>dac</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <doi>0</doi>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </secmodel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </host>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <guest>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <os_type>hvm</os_type>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <arch name='i686'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <wordsize>32</wordsize>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <domain type='qemu'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <domain type='kvm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <pae/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <nonpae/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <acpi default='on' toggle='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <apic default='on' toggle='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <cpuselection/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <deviceboot/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <disksnapshot default='on' toggle='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <externalSnapshot/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </guest>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <guest>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <os_type>hvm</os_type>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <arch name='x86_64'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <wordsize>64</wordsize>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <domain type='qemu'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <domain type='kvm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <acpi default='on' toggle='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <apic default='on' toggle='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <cpuselection/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <deviceboot/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <disksnapshot default='on' toggle='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <externalSnapshot/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </guest>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 
Jan 21 18:37:12 np0005591285 nova_compute[182755]: </capabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.081 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.085 182759 DEBUG nova.virt.libvirt.volume.mount [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.085 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 18:37:12 np0005591285 nova_compute[182755]: <domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <domain>kvm</domain>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <arch>i686</arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <vcpu max='4096'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <iothreads supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <os supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='firmware'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <loader supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>rom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pflash</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='readonly'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>yes</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='secure'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </loader>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='maximumMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <vendor>AMD</vendor>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='succor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='custom' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <memoryBacking supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='sourceType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>anonymous</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>memfd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </memoryBacking>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <disk supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='diskDevice'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>disk</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cdrom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>floppy</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>lun</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>fdc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>sata</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <graphics supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vnc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egl-headless</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <video supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='modelType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vga</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cirrus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>none</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>bochs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ramfb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hostdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='mode'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>subsystem</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='startupPolicy'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>mandatory</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>requisite</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>optional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='subsysType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pci</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='capsType'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='pciBackend'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hostdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <rng supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>random</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <filesystem supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='driverType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>path</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>handle</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtiofs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </filesystem>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tpm supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-tis</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-crb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emulator</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>external</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendVersion'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>2.0</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </tpm>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <redirdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </redirdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <channel supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </channel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <crypto supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </crypto>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <interface supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>passt</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <panic supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>isa</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>hyperv</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </panic>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <console supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>null</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dev</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pipe</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stdio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>udp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tcp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu-vdagent</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <gic supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <genid supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backup supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <async-teardown supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <s390-pv supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <ps2 supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tdx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sev supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sgx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hyperv supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='features'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>relaxed</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vapic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>spinlocks</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vpindex</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>runtime</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>synic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stimer</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reset</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vendor_id</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>frequencies</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reenlightenment</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tlbflush</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ipi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>avic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emsr_bitmap</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>xmm_input</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hyperv>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <launchSecurity supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: </domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.095 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 18:37:12 np0005591285 nova_compute[182755]: <domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <domain>kvm</domain>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <arch>i686</arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <vcpu max='240'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <iothreads supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <os supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='firmware'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <loader supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>rom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pflash</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='readonly'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>yes</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='secure'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </loader>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='maximumMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <vendor>AMD</vendor>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='succor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='custom' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <memoryBacking supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='sourceType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>anonymous</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>memfd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </memoryBacking>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <disk supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='diskDevice'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>disk</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cdrom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>floppy</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>lun</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ide</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>fdc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>sata</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <graphics supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vnc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egl-headless</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <video supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='modelType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vga</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cirrus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>none</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>bochs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ramfb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hostdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='mode'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>subsystem</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='startupPolicy'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>mandatory</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>requisite</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>optional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='subsysType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pci</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='capsType'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='pciBackend'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hostdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <rng supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>random</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <filesystem supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='driverType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>path</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>handle</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtiofs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </filesystem>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tpm supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-tis</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-crb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emulator</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>external</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendVersion'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>2.0</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </tpm>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <redirdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </redirdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <channel supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </channel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <crypto supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </crypto>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <interface supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>passt</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <panic supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>isa</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>hyperv</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </panic>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <console supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>null</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dev</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pipe</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stdio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>udp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tcp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu-vdagent</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <gic supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <genid supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backup supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <async-teardown supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <s390-pv supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <ps2 supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tdx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sev supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sgx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hyperv supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='features'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>relaxed</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vapic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>spinlocks</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vpindex</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>runtime</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>synic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stimer</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reset</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vendor_id</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>frequencies</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reenlightenment</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tlbflush</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ipi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>avic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emsr_bitmap</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>xmm_input</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hyperv>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <launchSecurity supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: </domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.158 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.162 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 18:37:12 np0005591285 nova_compute[182755]: <domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <domain>kvm</domain>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <arch>x86_64</arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <vcpu max='4096'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <iothreads supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <os supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='firmware'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>efi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <loader supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>rom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pflash</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='readonly'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>yes</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='secure'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>yes</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </loader>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='maximumMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <vendor>AMD</vendor>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='succor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='custom' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <memoryBacking supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='sourceType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>anonymous</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>memfd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </memoryBacking>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <disk supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='diskDevice'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>disk</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cdrom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>floppy</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>lun</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>fdc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>sata</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <graphics supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vnc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egl-headless</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <video supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='modelType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vga</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cirrus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>none</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>bochs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ramfb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hostdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='mode'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>subsystem</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='startupPolicy'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>mandatory</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>requisite</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>optional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='subsysType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pci</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='capsType'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='pciBackend'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hostdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <rng supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>random</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <filesystem supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='driverType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>path</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>handle</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtiofs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </filesystem>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tpm supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-tis</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-crb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emulator</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>external</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendVersion'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>2.0</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </tpm>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <redirdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </redirdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <channel supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </channel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <crypto supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </crypto>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <interface supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>passt</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <panic supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>isa</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>hyperv</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </panic>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <console supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>null</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dev</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pipe</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stdio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>udp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tcp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu-vdagent</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <gic supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <genid supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backup supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <async-teardown supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <s390-pv supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <ps2 supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tdx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sev supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sgx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hyperv supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='features'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>relaxed</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vapic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>spinlocks</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vpindex</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>runtime</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>synic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stimer</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reset</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vendor_id</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>frequencies</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reenlightenment</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tlbflush</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ipi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>avic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emsr_bitmap</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>xmm_input</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hyperv>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <launchSecurity supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: </domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.247 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 18:37:12 np0005591285 nova_compute[182755]: <domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <path>/usr/libexec/qemu-kvm</path>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <domain>kvm</domain>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <arch>x86_64</arch>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <vcpu max='240'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <iothreads supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <os supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='firmware'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <loader supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>rom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pflash</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='readonly'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>yes</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='secure'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>no</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </loader>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-passthrough' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='hostPassthroughMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='maximum' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='maximumMigratable'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>on</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>off</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='host-model' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <vendor>AMD</vendor>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='x2apic'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-deadline'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='hypervisor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc_adjust'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='spec-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='stibp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='cmp_legacy'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='overflow-recov'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='succor'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='amd-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='virt-ssbd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lbrv'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='tsc-scale'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='vmcb-clean'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='flushbyasid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pause-filter'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='pfthreshold'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='svme-addr-chk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <feature policy='disable' name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <mode name='custom' supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Broadwell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cascadelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='ClearwaterForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ddpd-u'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sha512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm3'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sm4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Cooperlake-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Denverton-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Dhyana-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Genoa-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Milan-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Rome-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-Turin-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amd-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='auto-ibrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vp2intersect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fs-gs-base-ns'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibpb-brtype'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='no-nested-data-bp'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='null-sel-clr-base'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='perfmon-v2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbpb'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='srso-user-kernel-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='stibp-always-on'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='EPYC-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='GraniteRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-128'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-256'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx10-512'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='prefetchiti'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Haswell-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-noTSX'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v6'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Icelake-Server-v7'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='IvyBridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='KnightsMill-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4fmaps'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-4vnniw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512er'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512pf'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G4-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Opteron_G5-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fma4'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tbm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xop'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SapphireRapids-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='amx-tile'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-bf16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-fp16'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512-vpopcntdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bitalg'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vbmi2'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrc'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fzrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='la57'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='taa-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='tsx-ldtrk'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='SierraForest-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ifma'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-ne-convert'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx-vnni-int8'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bhi-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='bus-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cmpccxadd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fbsdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='fsrs'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ibrs-all'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='intel-psfd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ipred-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='lam'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mcdt-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pbrsb-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='psdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rrsba-ctrl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='sbdr-ssdp-no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='serialize'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vaes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='vpclmulqdq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Client-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='hle'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='rtm'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Skylake-Server-v5'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512bw'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512cd'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512dq'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512f'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='avx512vl'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='invpcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pcid'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='pku'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='mpx'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v2'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v3'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='core-capability'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='split-lock-detect'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='Snowridge-v4'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='cldemote'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='erms'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='gfni'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdir64b'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='movdiri'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='xsaves'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='athlon-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='core2duo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='coreduo-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='n270-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='ss'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <blockers model='phenom-v1'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnow'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <feature name='3dnowext'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </blockers>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </mode>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <memoryBacking supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <enum name='sourceType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>anonymous</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <value>memfd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </memoryBacking>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <disk supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='diskDevice'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>disk</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cdrom</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>floppy</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>lun</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ide</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>fdc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>sata</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <graphics supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vnc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egl-headless</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <video supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='modelType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vga</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>cirrus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>none</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>bochs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ramfb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hostdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='mode'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>subsystem</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='startupPolicy'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>mandatory</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>requisite</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>optional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='subsysType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pci</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>scsi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='capsType'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='pciBackend'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hostdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <rng supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtio-non-transitional</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>random</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>egd</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <filesystem supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='driverType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>path</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>handle</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>virtiofs</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </filesystem>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tpm supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-tis</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tpm-crb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emulator</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>external</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendVersion'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>2.0</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </tpm>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <redirdev supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='bus'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>usb</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </redirdev>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <channel supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </channel>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <crypto supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendModel'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>builtin</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </crypto>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <interface supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='backendType'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>default</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>passt</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <panic supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='model'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>isa</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>hyperv</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </panic>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <console supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='type'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>null</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vc</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pty</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dev</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>file</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>pipe</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stdio</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>udp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tcp</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>unix</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>qemu-vdagent</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>dbus</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <gic supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <vmcoreinfo supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <genid supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backingStoreInput supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <backup supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <async-teardown supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <s390-pv supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <ps2 supported='yes'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <tdx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sev supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <sgx supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <hyperv supported='yes'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <enum name='features'>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>relaxed</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vapic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>spinlocks</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vpindex</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>runtime</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>synic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>stimer</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reset</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>vendor_id</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>frequencies</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>reenlightenment</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>tlbflush</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>ipi</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>avic</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>emsr_bitmap</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <value>xmm_input</value>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </enum>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      <defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <spinlocks>4095</spinlocks>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <stimer_direct>on</stimer_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_direct>on</tlbflush_direct>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <tlbflush_extended>on</tlbflush_extended>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:      </defaults>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    </hyperv>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:    <launchSecurity supported='no'/>
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: </domainCapabilities>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.324 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.324 182759 INFO nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Secure Boot support detected
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.326 182759 INFO nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.337 182759 DEBUG nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 18:37:12 np0005591285 nova_compute[182755]:  <model>Nehalem</model>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: </cpu>
Jan 21 18:37:12 np0005591285 nova_compute[182755]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.340 182759 DEBUG nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.367 182759 INFO nova.virt.node [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Determined node identity e96a8776-a298-4c19-937a-402cb8191067 from /var/lib/nova/compute_id#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.387 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Verified node e96a8776-a298-4c19-937a-402cb8191067 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.451 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.577 182759 ERROR nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Could not retrieve compute node resource provider e96a8776-a298-4c19-937a-402cb8191067 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider e96a8776-a298-4c19-937a-402cb8191067: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e96a8776-a298-4c19-937a-402cb8191067' not found: No resource provider with uuid e96a8776-a298-4c19-937a-402cb8191067 found  ", "request_id": "req-d8c5944d-090b-47a2-b708-abb1e830a4af"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider e96a8776-a298-4c19-937a-402cb8191067: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e96a8776-a298-4c19-937a-402cb8191067' not found: No resource provider with uuid e96a8776-a298-4c19-937a-402cb8191067 found  ", "request_id": "req-d8c5944d-090b-47a2-b708-abb1e830a4af"}]}#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.614 182759 DEBUG oslo_concurrency.lockutils [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.615 182759 DEBUG oslo_concurrency.lockutils [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.615 182759 DEBUG oslo_concurrency.lockutils [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.615 182759 DEBUG nova.compute.resource_tracker [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.794 182759 WARNING nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.795 182759 DEBUG nova.compute.resource_tracker [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6202MB free_disk=73.5861587524414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.796 182759 DEBUG oslo_concurrency.lockutils [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.796 182759 DEBUG oslo_concurrency.lockutils [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.908 182759 ERROR nova.compute.resource_tracker [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider e96a8776-a298-4c19-937a-402cb8191067: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e96a8776-a298-4c19-937a-402cb8191067' not found: No resource provider with uuid e96a8776-a298-4c19-937a-402cb8191067 found  ", "request_id": "req-8f8e4b2e-5c7a-4531-866c-b6d691ae0101"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider e96a8776-a298-4c19-937a-402cb8191067: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e96a8776-a298-4c19-937a-402cb8191067' not found: No resource provider with uuid e96a8776-a298-4c19-937a-402cb8191067 found  ", "request_id": "req-8f8e4b2e-5c7a-4531-866c-b6d691ae0101"}]}#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.908 182759 DEBUG nova.compute.resource_tracker [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:37:12 np0005591285 nova_compute[182755]: 2026-01-21 23:37:12.908 182759 DEBUG nova.compute.resource_tracker [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.154 182759 INFO nova.scheduler.client.report [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [req-729906ca-d362-4321-989a-038b4562f3e3] Created resource provider record via placement API for resource provider with UUID e96a8776-a298-4c19-937a-402cb8191067 and name compute-2.ctlplane.example.com.#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.263 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 21 18:37:13 np0005591285 nova_compute[182755]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.263 182759 INFO nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.265 182759 DEBUG nova.compute.provider_tree [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.265 182759 DEBUG nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.269 182759 DEBUG nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Libvirt baseline CPU <cpu>
Jan 21 18:37:13 np0005591285 nova_compute[182755]:  <arch>x86_64</arch>
Jan 21 18:37:13 np0005591285 nova_compute[182755]:  <model>Nehalem</model>
Jan 21 18:37:13 np0005591285 nova_compute[182755]:  <vendor>AMD</vendor>
Jan 21 18:37:13 np0005591285 nova_compute[182755]:  <topology sockets="8" cores="1" threads="1"/>
Jan 21 18:37:13 np0005591285 nova_compute[182755]: </cpu>
Jan 21 18:37:13 np0005591285 nova_compute[182755]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.390 182759 DEBUG nova.scheduler.client.report [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Updated inventory for provider e96a8776-a298-4c19-937a-402cb8191067 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.390 182759 DEBUG nova.compute.provider_tree [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Updating resource provider e96a8776-a298-4c19-937a-402cb8191067 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.391 182759 DEBUG nova.compute.provider_tree [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.523 182759 DEBUG nova.compute.provider_tree [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Updating resource provider e96a8776-a298-4c19-937a-402cb8191067 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.583 182759 DEBUG nova.compute.resource_tracker [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.583 182759 DEBUG oslo_concurrency.lockutils [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.584 182759 DEBUG nova.service [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.668 182759 DEBUG nova.service [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 21 18:37:13 np0005591285 nova_compute[182755]: 2026-01-21 23:37:13.669 182759 DEBUG nova.servicegroup.drivers.db [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 21 18:37:17 np0005591285 systemd-logind[788]: New session 26 of user zuul.
Jan 21 18:37:17 np0005591285 systemd[1]: Started Session 26 of User zuul.
Jan 21 18:37:18 np0005591285 python3.9[183224]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 18:37:20 np0005591285 python3.9[183380]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:37:20 np0005591285 systemd[1]: Reloading.
Jan 21 18:37:20 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:37:20 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:37:21 np0005591285 python3.9[183567]: ansible-ansible.builtin.service_facts Invoked
Jan 21 18:37:21 np0005591285 network[183584]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 18:37:21 np0005591285 network[183585]: 'network-scripts' will be removed from distribution in near future.
Jan 21 18:37:21 np0005591285 network[183586]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 18:37:25 np0005591285 nova_compute[182755]: 2026-01-21 23:37:25.672 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:37:25 np0005591285 nova_compute[182755]: 2026-01-21 23:37:25.729 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:37:28 np0005591285 python3.9[183858]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:37:29 np0005591285 python3.9[184011]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:29 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:37:29 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 18:37:30 np0005591285 python3.9[184164]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:31 np0005591285 python3.9[184316]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:37:32 np0005591285 python3.9[184468]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:37:33 np0005591285 podman[184620]: 2026-01-21 23:37:33.496225185 +0000 UTC m=+0.106537422 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:37:33 np0005591285 python3.9[184621]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:37:33 np0005591285 systemd[1]: Reloading.
Jan 21 18:37:33 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:37:33 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:37:34 np0005591285 python3.9[184834]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 18:37:35 np0005591285 python3.9[184987]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:37:36 np0005591285 python3.9[185137]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:37 np0005591285 python3.9[185291]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 21 18:37:39 np0005591285 python3.9[185443]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 21 18:37:39 np0005591285 python3.9[185596]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 18:37:41 np0005591285 python3.9[185754]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 18:37:41 np0005591285 podman[185756]: 2026-01-21 23:37:41.17767031 +0000 UTC m=+0.075922315 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 21 18:37:42 np0005591285 python3.9[185932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:43 np0005591285 python3.9[186053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038662.1281016-521-256586576195245/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:43 np0005591285 python3.9[186203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:44 np0005591285 python3.9[186324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038663.4753103-521-273152160347583/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:45 np0005591285 python3.9[186474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:45 np0005591285 python3.9[186595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038664.7067764-521-94971374269181/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:46 np0005591285 python3.9[186745]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:47 np0005591285 python3.9[186897]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:48 np0005591285 python3.9[187049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:48 np0005591285 python3.9[187170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038667.7099195-698-260114492255538/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:37:49 np0005591285 python3.9[187320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:50 np0005591285 python3.9[187441]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038669.0753827-698-56501801690768/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:37:50 np0005591285 python3.9[187591]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:51 np0005591285 python3.9[187712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038670.3938043-786-127363485407179/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:37:52 np0005591285 python3.9[187862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:52 np0005591285 python3.9[187983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038671.8938181-833-69559600085474/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:53 np0005591285 python3.9[188133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:54 np0005591285 python3.9[188254]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038673.1922276-878-222662760636539/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:55 np0005591285 python3.9[188404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:37:55 np0005591285 python3.9[188525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038674.701782-923-24499752373434/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:56 np0005591285 python3.9[188677]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:57 np0005591285 python3.9[188829]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:37:58 np0005591285 python3.9[188979]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:59 np0005591285 python3.9[189131]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:37:59 np0005591285 python3.9[189283]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:38:00 np0005591285 python3.9[189437]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:01 np0005591285 python3.9[189589]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:38:01 np0005591285 systemd[1]: Reloading.
Jan 21 18:38:01 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:38:01 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:38:02 np0005591285 systemd[1]: Listening on Podman API Socket.
Jan 21 18:38:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:38:02.939 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:38:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:38:02.944 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:38:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:38:02.945 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:38:03 np0005591285 python3.9[189779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:03 np0005591285 podman[189874]: 2026-01-21 23:38:03.865776381 +0000 UTC m=+0.136863461 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:38:04 np0005591285 python3.9[189922]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038682.7113466-1139-61467443074608/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:04 np0005591285 python3.9[190004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:05 np0005591285 python3.9[190127]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038682.7113466-1139-61467443074608/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:06 np0005591285 python3.9[190279]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:07 np0005591285 python3.9[190431]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:08 np0005591285 python3.9[190583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:09 np0005591285 python3.9[190706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038688.0601134-1283-169293214492504/.source.json _original_basename=.63sc9rix follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:10 np0005591285 python3.9[190856]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.220 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.221 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.222 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.222 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.239 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.240 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.240 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.241 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.241 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.242 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.242 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.270 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.270 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.271 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.271 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.477 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.479 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6148MB free_disk=73.58586120605469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.479 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.479 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.556 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.556 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.581 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.603 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.605 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:38:11 np0005591285 nova_compute[182755]: 2026-01-21 23:38:11.605 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:38:11 np0005591285 podman[191101]: 2026-01-21 23:38:11.670849824 +0000 UTC m=+0.089227271 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 21 18:38:13 np0005591285 python3.9[191298]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 21 18:38:14 np0005591285 python3.9[191450]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:38:15 np0005591285 python3[191602]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:38:15 np0005591285 podman[191637]: 2026-01-21 23:38:15.918525105 +0000 UTC m=+0.076156742 container create b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:38:15 np0005591285 podman[191637]: 2026-01-21 23:38:15.874004417 +0000 UTC m=+0.031636134 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 18:38:15 np0005591285 python3[191602]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 21 18:38:16 np0005591285 python3.9[191827]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:38:17 np0005591285 python3.9[191981]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:18 np0005591285 python3.9[192057]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:38:19 np0005591285 python3.9[192208]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038698.360533-1516-146754851533280/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:20 np0005591285 python3.9[192284]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:38:20 np0005591285 systemd[1]: Reloading.
Jan 21 18:38:20 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:38:20 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:38:21 np0005591285 python3.9[192396]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:38:21 np0005591285 systemd[1]: Reloading.
Jan 21 18:38:21 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:38:21 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:38:21 np0005591285 systemd[1]: Starting ceilometer_agent_compute container...
Jan 21 18:38:21 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:38:21 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7579319ff7ecfe7bddf1b95356510720995de16471d38fcc50c90298a9c0d520/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:21 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7579319ff7ecfe7bddf1b95356510720995de16471d38fcc50c90298a9c0d520/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:21 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7579319ff7ecfe7bddf1b95356510720995de16471d38fcc50c90298a9c0d520/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:21 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7579319ff7ecfe7bddf1b95356510720995de16471d38fcc50c90298a9c0d520/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:21 np0005591285 systemd[1]: Started /usr/bin/podman healthcheck run b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0.
Jan 21 18:38:21 np0005591285 podman[192437]: 2026-01-21 23:38:21.699731917 +0000 UTC m=+0.140035866 container init b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + sudo -E kolla_set_configs
Jan 21 18:38:21 np0005591285 podman[192437]: 2026-01-21 23:38:21.72347626 +0000 UTC m=+0.163780179 container start b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:38:21 np0005591285 podman[192437]: ceilometer_agent_compute
Jan 21 18:38:21 np0005591285 systemd[1]: Started ceilometer_agent_compute container.
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: sudo: unable to send audit message: Operation not permitted
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Validating config file
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Copying service configuration files
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: INFO:__main__:Writing out command to execute
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: ++ cat /run_command
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + ARGS=
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + sudo kolla_copy_cacerts
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: sudo: unable to send audit message: Operation not permitted
Jan 21 18:38:21 np0005591285 podman[192459]: 2026-01-21 23:38:21.846641255 +0000 UTC m=+0.099108414 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + [[ ! -n '' ]]
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + . kolla_extend_start
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + umask 0022
Jan 21 18:38:21 np0005591285 ceilometer_agent_compute[192452]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 21 18:38:21 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:38:21 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Failed with result 'exit-code'.
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.738 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.739 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.739 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.739 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.740 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.741 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.742 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.743 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.744 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.745 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.746 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.747 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.748 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.749 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.750 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.751 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.752 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.753 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.754 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.755 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.756 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.757 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.757 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.757 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.778 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.780 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.781 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 21 18:38:22 np0005591285 python3.9[192634]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:38:22 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:22.962 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.097 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.098 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.099 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.100 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.101 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.102 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.103 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.104 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.105 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.115 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.116 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.119 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.120 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.121 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.124 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.130 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:38:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:38:24 np0005591285 python3.9[192791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:25 np0005591285 python3.9[192916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038703.7688951-1652-232369026873119/.source.yaml _original_basename=.j5q4gii_ follow=False checksum=269c2ae6db6905fafe50874cc0e2c17a6e3f803d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:25 np0005591285 python3.9[193068]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:26 np0005591285 python3.9[193191]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038705.363443-1697-254325979385216/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:27 np0005591285 python3.9[193343]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:28 np0005591285 python3.9[193495]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:29 np0005591285 python3.9[193647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:30 np0005591285 python3.9[193725]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.37aof5qp recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:31 np0005591285 python3.9[193875]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:34 np0005591285 podman[194270]: 2026-01-21 23:38:34.220241489 +0000 UTC m=+0.131786186 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:38:34 np0005591285 python3.9[194319]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 21 18:38:35 np0005591285 python3.9[194476]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:38:36 np0005591285 python3[194628]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:38:36 np0005591285 podman[194664]: 2026-01-21 23:38:36.941430988 +0000 UTC m=+0.061645356 container create 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 18:38:36 np0005591285 podman[194664]: 2026-01-21 23:38:36.908705615 +0000 UTC m=+0.028920073 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 18:38:36 np0005591285 python3[194628]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 21 18:38:37 np0005591285 python3.9[194854]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:38:38 np0005591285 python3.9[195008]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:39 np0005591285 python3.9[195084]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:38:39 np0005591285 python3.9[195235]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038719.2492166-2032-89344427495299/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:40 np0005591285 python3.9[195311]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:38:40 np0005591285 systemd[1]: Reloading.
Jan 21 18:38:40 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:38:40 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:38:41 np0005591285 python3.9[195423]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:38:41 np0005591285 systemd[1]: Reloading.
Jan 21 18:38:41 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:38:41 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:38:41 np0005591285 systemd[1]: Starting node_exporter container...
Jan 21 18:38:42 np0005591285 podman[195462]: 2026-01-21 23:38:42.071143045 +0000 UTC m=+0.091477351 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 18:38:42 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:38:42 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/125baa2bcbc3e730d4a848447de6678fd7d57bb40c03f9baf258f064a10ccdc2/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:42 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/125baa2bcbc3e730d4a848447de6678fd7d57bb40c03f9baf258f064a10ccdc2/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 18:38:42 np0005591285 systemd[1]: Started /usr/bin/podman healthcheck run 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416.
Jan 21 18:38:42 np0005591285 podman[195464]: 2026-01-21 23:38:42.137425852 +0000 UTC m=+0.158255031 container init 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.156Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.156Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.156Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.157Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.157Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.157Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=arp
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=bcache
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=bonding
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=cpu
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=edac
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=filefd
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=netclass
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=netdev
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=netstat
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=nfs
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=nvme
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=softnet
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=systemd
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=xfs
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.158Z caller=node_exporter.go:117 level=info collector=zfs
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.159Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 21 18:38:42 np0005591285 node_exporter[195496]: ts=2026-01-21T23:38:42.160Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 21 18:38:42 np0005591285 podman[195464]: 2026-01-21 23:38:42.166970111 +0000 UTC m=+0.187799290 container start 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:38:42 np0005591285 podman[195464]: node_exporter
Jan 21 18:38:42 np0005591285 systemd[1]: Started node_exporter container.
Jan 21 18:38:42 np0005591285 podman[195510]: 2026-01-21 23:38:42.27567128 +0000 UTC m=+0.090310050 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 18:38:43 np0005591285 python3.9[195684]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:38:44 np0005591285 python3.9[195836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:45 np0005591285 python3.9[195961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038724.0214145-2168-182644188201341/.source.yaml _original_basename=.4hwa6nfq follow=False checksum=46691264d48ac1d5b49ea21d640bb39d5e6f28d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:46 np0005591285 python3.9[196113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:46 np0005591285 python3.9[196236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038725.5269165-2213-174168051468808/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:48 np0005591285 python3.9[196388]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:48 np0005591285 python3.9[196540]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:38:49 np0005591285 python3.9[196692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:38:50 np0005591285 python3.9[196770]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.9h2gt42x recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:50 np0005591285 python3.9[196920]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:38:52 np0005591285 podman[197141]: 2026-01-21 23:38:52.2252376 +0000 UTC m=+0.080281182 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:38:52 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:38:52 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Failed with result 'exit-code'.
Jan 21 18:38:53 np0005591285 python3.9[197361]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 21 18:38:54 np0005591285 python3.9[197513]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:38:55 np0005591285 python3[197665]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:38:57 np0005591285 podman[197678]: 2026-01-21 23:38:57.330830413 +0000 UTC m=+1.364443083 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 18:38:57 np0005591285 podman[197776]: 2026-01-21 23:38:57.504024932 +0000 UTC m=+0.075313690 container create 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible)
Jan 21 18:38:57 np0005591285 podman[197776]: 2026-01-21 23:38:57.469342746 +0000 UTC m=+0.040631515 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 18:38:57 np0005591285 python3[197665]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 21 18:38:58 np0005591285 python3.9[197966]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:38:59 np0005591285 python3.9[198120]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:00 np0005591285 python3.9[198196]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:39:00 np0005591285 python3.9[198347]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038740.1352932-2550-71052600992479/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:01 np0005591285 python3.9[198423]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:39:01 np0005591285 systemd[1]: Reloading.
Jan 21 18:39:01 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:39:01 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:39:02 np0005591285 python3.9[198535]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:39:02 np0005591285 systemd[1]: Reloading.
Jan 21 18:39:02 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:39:02 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:39:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:39:02.941 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:39:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:39:02.948 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:39:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:39:02.949 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:39:02 np0005591285 systemd[1]: Starting podman_exporter container...
Jan 21 18:39:03 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:39:03 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1efac981a1ec939885ec9ce58159584464e235d5bd83138a30ef97998263ced5/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 18:39:03 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1efac981a1ec939885ec9ce58159584464e235d5bd83138a30ef97998263ced5/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 18:39:03 np0005591285 systemd[1]: Started /usr/bin/podman healthcheck run 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4.
Jan 21 18:39:03 np0005591285 podman[198576]: 2026-01-21 23:39:03.169638492 +0000 UTC m=+0.179729095 container init 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.193Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.193Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.193Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.193Z caller=handler.go:105 level=info collector=container
Jan 21 18:39:03 np0005591285 podman[198576]: 2026-01-21 23:39:03.207712428 +0000 UTC m=+0.217803011 container start 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:39:03 np0005591285 podman[198576]: podman_exporter
Jan 21 18:39:03 np0005591285 systemd[1]: Starting Podman API Service...
Jan 21 18:39:03 np0005591285 systemd[1]: Started podman_exporter container.
Jan 21 18:39:03 np0005591285 systemd[1]: Started Podman API Service.
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="Setting parallel job count to 25"
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="Using sqlite as database backend"
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 21 18:39:03 np0005591285 podman[198602]: @ - - [21/Jan/2026:23:39:03 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 21 18:39:03 np0005591285 podman[198602]: time="2026-01-21T23:39:03Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 18:39:03 np0005591285 podman[198602]: @ - - [21/Jan/2026:23:39:03 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18075 "" "Go-http-client/1.1"
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.312Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.312Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 21 18:39:03 np0005591285 podman_exporter[198591]: ts=2026-01-21T23:39:03.313Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 21 18:39:03 np0005591285 podman[198601]: 2026-01-21 23:39:03.339758659 +0000 UTC m=+0.114084814 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:39:03 np0005591285 systemd[1]: 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4-55d9866070e937ca.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:39:03 np0005591285 systemd[1]: 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4-55d9866070e937ca.service: Failed with result 'exit-code'.
Jan 21 18:39:04 np0005591285 python3.9[198782]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:39:05 np0005591285 podman[198812]: 2026-01-21 23:39:05.246020282 +0000 UTC m=+0.112849101 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 18:39:05 np0005591285 python3.9[198957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:39:06 np0005591285 python3.9[199082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038745.1448343-2684-10598715123400/.source.yaml _original_basename=.tuw4uat7 follow=False checksum=d938fb355e6aa721d1e605900a3efc3868a25fbd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:07 np0005591285 python3.9[199234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:39:07 np0005591285 python3.9[199357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038746.6128674-2729-168651591602265/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:39:09 np0005591285 python3.9[199509]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:10 np0005591285 python3.9[199661]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 18:39:11 np0005591285 python3.9[199813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:39:11 np0005591285 python3.9[199891]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.csdsjlwk recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.597 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.599 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.623 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.624 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.624 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.638 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 18:39:11 np0005591285 nova_compute[182755]: 2026-01-21 23:39:11.638 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:12 np0005591285 nova_compute[182755]: 2026-01-21 23:39:12.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:12 np0005591285 nova_compute[182755]: 2026-01-21 23:39:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:12 np0005591285 podman[200042]: 2026-01-21 23:39:12.219235728 +0000 UTC m=+0.081072723 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:39:12 np0005591285 python3.9[200041]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:12 np0005591285 podman[200064]: 2026-01-21 23:39:12.41311841 +0000 UTC m=+0.069585817 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.245 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.246 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.448 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.451 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6006MB free_disk=73.53351211547852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.451 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.451 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.535 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.536 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.573 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.600 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.601 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:39:13 np0005591285 nova_compute[182755]: 2026-01-21 23:39:13.601 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:39:14 np0005591285 python3.9[200512]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 21 18:39:15 np0005591285 python3.9[200664]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 18:39:17 np0005591285 python3[200816]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 18:39:19 np0005591285 podman[200829]: 2026-01-21 23:39:19.562446158 +0000 UTC m=+2.474166854 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 18:39:19 np0005591285 podman[200929]: 2026-01-21 23:39:19.721213791 +0000 UTC m=+0.046323989 container create 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Jan 21 18:39:19 np0005591285 podman[200929]: 2026-01-21 23:39:19.697491347 +0000 UTC m=+0.022601565 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 18:39:19 np0005591285 python3[200816]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 18:39:21 np0005591285 python3.9[201119]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:39:22 np0005591285 python3.9[201273]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:22 np0005591285 podman[201321]: 2026-01-21 23:39:22.503565321 +0000 UTC m=+0.082091266 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 18:39:22 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:39:22 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Failed with result 'exit-code'.
Jan 21 18:39:22 np0005591285 python3.9[201367]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 18:39:23 np0005591285 python3.9[201519]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038762.789337-3064-184692332833621/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:24 np0005591285 python3.9[201595]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 18:39:24 np0005591285 systemd[1]: Reloading.
Jan 21 18:39:24 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:39:24 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:39:25 np0005591285 python3.9[201706]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 18:39:25 np0005591285 systemd[1]: Reloading.
Jan 21 18:39:25 np0005591285 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 18:39:25 np0005591285 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 18:39:25 np0005591285 systemd[1]: Starting openstack_network_exporter container...
Jan 21 18:39:25 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:39:25 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb00b19886cddd30d0f96a13755ba7b7b0f20b643ed50517cddc963a6450f18/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 18:39:25 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb00b19886cddd30d0f96a13755ba7b7b0f20b643ed50517cddc963a6450f18/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 18:39:25 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb00b19886cddd30d0f96a13755ba7b7b0f20b643ed50517cddc963a6450f18/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 18:39:25 np0005591285 systemd[1]: Started /usr/bin/podman healthcheck run 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662.
Jan 21 18:39:25 np0005591285 podman[201745]: 2026-01-21 23:39:25.759945829 +0000 UTC m=+0.155586119 container init 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *bridge.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *coverage.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *datapath.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *iface.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *memory.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *ovn.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *pmd_perf.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *pmd_rxq.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: INFO    23:39:25 main.go:48: registering *vswitch.Collector
Jan 21 18:39:25 np0005591285 openstack_network_exporter[201761]: NOTICE  23:39:25 main.go:76: listening on https://:9105/metrics
Jan 21 18:39:25 np0005591285 podman[201745]: 2026-01-21 23:39:25.792364276 +0000 UTC m=+0.188004376 container start 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 18:39:25 np0005591285 podman[201745]: openstack_network_exporter
Jan 21 18:39:25 np0005591285 systemd[1]: Started openstack_network_exporter container.
Jan 21 18:39:25 np0005591285 podman[201766]: 2026-01-21 23:39:25.909470814 +0000 UTC m=+0.107043791 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Jan 21 18:39:26 np0005591285 python3.9[201943]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 18:39:28 np0005591285 python3.9[202095]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 18:39:28 np0005591285 python3.9[202220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038767.6400065-3200-262716734026250/.source.yaml _original_basename=.ptri_9vf follow=False checksum=1ff3e7ccec2ab5a1b726a13e028ede8a02ff8cbe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:39:29 np0005591285 python3.9[202372]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 18:39:34 np0005591285 podman[202397]: 2026-01-21 23:39:34.227788526 +0000 UTC m=+0.092458152 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:39:36 np0005591285 podman[202422]: 2026-01-21 23:39:36.25933893 +0000 UTC m=+0.128081055 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 18:39:43 np0005591285 podman[202449]: 2026-01-21 23:39:43.232723796 +0000 UTC m=+0.083775619 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:39:43 np0005591285 podman[202448]: 2026-01-21 23:39:43.242527729 +0000 UTC m=+0.105569972 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 18:39:53 np0005591285 podman[202489]: 2026-01-21 23:39:53.216768994 +0000 UTC m=+0.071906882 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:39:53 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 18:39:53 np0005591285 systemd[1]: b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0-eaf718c58fdcd99.service: Failed with result 'exit-code'.
Jan 21 18:39:56 np0005591285 podman[202509]: 2026-01-21 23:39:56.232335937 +0000 UTC m=+0.090072608 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:40:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:40:02.942 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:40:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:40:02.945 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:40:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:40:02.945 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:40:05 np0005591285 podman[202531]: 2026-01-21 23:40:05.207925203 +0000 UTC m=+0.073924597 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:40:07 np0005591285 podman[202556]: 2026-01-21 23:40:07.235518421 +0000 UTC m=+0.106150327 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:40:11 np0005591285 auditd[706]: Audit daemon rotating log files
Jan 21 18:40:11 np0005591285 nova_compute[182755]: 2026-01-21 23:40:11.603 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:11 np0005591285 nova_compute[182755]: 2026-01-21 23:40:11.605 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:40:11 np0005591285 nova_compute[182755]: 2026-01-21 23:40:11.605 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:40:11 np0005591285 nova_compute[182755]: 2026-01-21 23:40:11.631 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 18:40:11 np0005591285 python3.9[202711]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 21 18:40:12 np0005591285 nova_compute[182755]: 2026-01-21 23:40:12.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:12 np0005591285 nova_compute[182755]: 2026-01-21 23:40:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:12 np0005591285 nova_compute[182755]: 2026-01-21 23:40:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:12 np0005591285 python3.9[202877]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:40:12 np0005591285 systemd[1]: Started libpod-conmon-e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e.scope.
Jan 21 18:40:12 np0005591285 podman[202878]: 2026-01-21 23:40:12.941914637 +0000 UTC m=+0.120767018 container exec e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:40:12 np0005591285 podman[202878]: 2026-01-21 23:40:12.952221003 +0000 UTC m=+0.131073344 container exec_died e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 21 18:40:13 np0005591285 systemd[1]: libpod-conmon-e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e.scope: Deactivated successfully.
Jan 21 18:40:13 np0005591285 nova_compute[182755]: 2026-01-21 23:40:13.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:13 np0005591285 podman[203035]: 2026-01-21 23:40:13.601712961 +0000 UTC m=+0.085155306 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:40:13 np0005591285 podman[203034]: 2026-01-21 23:40:13.603062087 +0000 UTC m=+0.091014903 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 18:40:13 np0005591285 python3.9[203102]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 18:40:13 np0005591285 systemd[1]: Started libpod-conmon-e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e.scope.
Jan 21 18:40:13 np0005591285 podman[203103]: 2026-01-21 23:40:13.938137362 +0000 UTC m=+0.120726337 container exec e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:40:13 np0005591285 podman[203103]: 2026-01-21 23:40:13.972406138 +0000 UTC m=+0.154995063 container exec_died e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 18:40:14 np0005591285 systemd[1]: libpod-conmon-e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e.scope: Deactivated successfully.
Jan 21 18:40:14 np0005591285 nova_compute[182755]: 2026-01-21 23:40:14.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:14 np0005591285 nova_compute[182755]: 2026-01-21 23:40:14.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:14 np0005591285 nova_compute[182755]: 2026-01-21 23:40:14.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:14 np0005591285 nova_compute[182755]: 2026-01-21 23:40:14.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:40:14 np0005591285 python3.9[203287]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.255 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.255 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.445 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.447 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6007MB free_disk=73.41682434082031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.447 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.448 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.518 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.518 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.539 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.555 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.557 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:40:15 np0005591285 nova_compute[182755]: 2026-01-21 23:40:15.557 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:40:15 np0005591285 python3.9[203439]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 21 18:45:56 np0005591285 nova_compute[182755]: 2026-01-21 23:45:56.356 182759 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf6wm0dwd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 21 18:45:56 np0005591285 nova_compute[182755]: 2026-01-21 23:45:56.391 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:45:56 np0005591285 nova_compute[182755]: 2026-01-21 23:45:56.391 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:45:56 np0005591285 nova_compute[182755]: 2026-01-21 23:45:56.392 182759 DEBUG nova.network.neutron [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:45:56 np0005591285 rsyslogd[1006]: imjournal: 2259 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 21 18:45:58 np0005591285 nova_compute[182755]: 2026-01-21 23:45:58.926 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:45:59 np0005591285 nova_compute[182755]: 2026-01-21 23:45:59.009 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.584 182759 DEBUG nova.network.neutron [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.610 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.626 182759 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf6wm0dwd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.627 182759 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Creating instance directory: /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.628 182759 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Creating disk.info with the contents: {'/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk': 'qcow2', '/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.629 182759 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.631 182759 DEBUG nova.objects.instance [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.672 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.732 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.735 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.736 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.761 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.831 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.833 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.873 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.875 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.875 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.935 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.937 182759 DEBUG nova.virt.disk.api [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Checking if we can resize image /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:46:00 np0005591285 nova_compute[182755]: 2026-01-21 23:46:00.937 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.004 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.005 182759 DEBUG nova.virt.disk.api [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Cannot resize image /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.006 182759 DEBUG nova.objects.instance [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.037 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.067 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config 485376" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.070 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config to /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.070 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.503 182759 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.504 182759 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.507 182759 DEBUG nova.virt.libvirt.vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1333035319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1333035319',id=12,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:45:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-45mgwptl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:45:47Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.508 182759 DEBUG nova.network.os_vif_util [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.510 182759 DEBUG nova.network.os_vif_util [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.511 182759 DEBUG os_vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.514 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.516 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.534 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.534 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbae5fde2-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.535 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbae5fde2-5e, col_values=(('external_ids', {'iface-id': 'bae5fde2-5ead-4ae5-90dd-1d6d468541ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:ac:86', 'vm-uuid': '69dceb72-db44-4bfc-9b98-cc8b39885ae7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.538 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:01 np0005591285 NetworkManager[55017]: <info>  [1769039161.5392] manager: (tapbae5fde2-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.542 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.548 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.550 182759 INFO os_vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e')#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.551 182759 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 21 18:46:01 np0005591285 nova_compute[182755]: 2026-01-21 23:46:01.551 182759 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf6wm0dwd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 21 18:46:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:02.950 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:02.951 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:02.951 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:03 np0005591285 nova_compute[182755]: 2026-01-21 23:46:03.928 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:04 np0005591285 nova_compute[182755]: 2026-01-21 23:46:04.071 182759 DEBUG nova.network.neutron [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Port bae5fde2-5ead-4ae5-90dd-1d6d468541ea updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 21 18:46:04 np0005591285 nova_compute[182755]: 2026-01-21 23:46:04.090 182759 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf6wm0dwd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 21 18:46:04 np0005591285 systemd[1]: Starting libvirt proxy daemon...
Jan 21 18:46:04 np0005591285 podman[212934]: 2026-01-21 23:46:04.241816384 +0000 UTC m=+0.101591244 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 21 18:46:04 np0005591285 systemd[1]: Started libvirt proxy daemon.
Jan 21 18:46:04 np0005591285 kernel: tapbae5fde2-5e: entered promiscuous mode
Jan 21 18:46:04 np0005591285 NetworkManager[55017]: <info>  [1769039164.4331] manager: (tapbae5fde2-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 21 18:46:04 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:04Z|00053|binding|INFO|Claiming lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea for this additional chassis.
Jan 21 18:46:04 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:04Z|00054|binding|INFO|bae5fde2-5ead-4ae5-90dd-1d6d468541ea: Claiming fa:16:3e:6f:ac:86 10.100.0.6
Jan 21 18:46:04 np0005591285 nova_compute[182755]: 2026-01-21 23:46:04.432 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:04 np0005591285 nova_compute[182755]: 2026-01-21 23:46:04.435 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:04 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:04Z|00055|binding|INFO|Setting lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea ovn-installed in OVS
Jan 21 18:46:04 np0005591285 nova_compute[182755]: 2026-01-21 23:46:04.449 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:04 np0005591285 nova_compute[182755]: 2026-01-21 23:46:04.451 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:04 np0005591285 systemd-machined[154022]: New machine qemu-4-instance-0000000c.
Jan 21 18:46:04 np0005591285 systemd-udevd[212988]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:46:04 np0005591285 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Jan 21 18:46:04 np0005591285 NetworkManager[55017]: <info>  [1769039164.5121] device (tapbae5fde2-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:46:04 np0005591285 NetworkManager[55017]: <info>  [1769039164.5132] device (tapbae5fde2-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:46:05 np0005591285 nova_compute[182755]: 2026-01-21 23:46:05.384 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039165.383969, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:05 np0005591285 nova_compute[182755]: 2026-01-21 23:46:05.386 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Started (Lifecycle Event)#033[00m
Jan 21 18:46:05 np0005591285 nova_compute[182755]: 2026-01-21 23:46:05.596 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:06 np0005591285 nova_compute[182755]: 2026-01-21 23:46:06.391 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039166.3907344, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:06 np0005591285 nova_compute[182755]: 2026-01-21 23:46:06.392 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:46:06 np0005591285 nova_compute[182755]: 2026-01-21 23:46:06.454 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:06 np0005591285 nova_compute[182755]: 2026-01-21 23:46:06.458 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:46:06 np0005591285 nova_compute[182755]: 2026-01-21 23:46:06.506 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 21 18:46:06 np0005591285 nova_compute[182755]: 2026-01-21 23:46:06.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:07 np0005591285 podman[213016]: 2026-01-21 23:46:07.239433495 +0000 UTC m=+0.091969134 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 18:46:08 np0005591285 nova_compute[182755]: 2026-01-21 23:46:08.932 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:09 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:09Z|00056|binding|INFO|Claiming lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea for this chassis.
Jan 21 18:46:09 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:09Z|00057|binding|INFO|bae5fde2-5ead-4ae5-90dd-1d6d468541ea: Claiming fa:16:3e:6f:ac:86 10.100.0.6
Jan 21 18:46:09 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:09Z|00058|binding|INFO|Setting lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea up in Southbound
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.029 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:ac:86 10.100.0.6'], port_security=['fa:16:3e:6f:ac:86 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '21', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bae5fde2-5ead-4ae5-90dd-1d6d468541ea) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.031 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bae5fde2-5ead-4ae5-90dd-1d6d468541ea in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c bound to our chassis#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.032 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.047 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eb34d5bd-f708-49e4-ae0c-244c03e7735f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.049 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb7816b8e-51 in ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.052 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb7816b8e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.052 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7706abcd-ede3-4444-95ea-40fc06375a65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.054 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4d16fa76-d6ff-454e-9b56-4e2f08c1c8bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.075 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[103288d3-c185-4b14-9d5c-3bfda20a0131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.105 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[05182e6a-2d48-4691-b273-a3839994a149]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.147 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ed421d6d-6776-418f-9d58-97f1d3776b9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 NetworkManager[55017]: <info>  [1769039169.1572] manager: (tapb7816b8e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.157 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[304665c6-ea89-460f-92ce-93b85dcf8191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 systemd-udevd[213044]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.215 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd5be02-52f0-48eb-9149-74d54bd1a026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.221 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6d456fc9-3ccf-44dd-8f89-de26654b4e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 NetworkManager[55017]: <info>  [1769039169.2566] device (tapb7816b8e-50): carrier: link connected
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.261 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf0cb2f-5bec-4b18-ac18-11ba2a2e022c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.286 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bab23a5d-db7a-4151-ba12-e009beff02ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372791, 'reachable_time': 44846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213063, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.302 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dce9f904-16ff-41ac-9103-e80f47ef7b5d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:20b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372791, 'tstamp': 372791}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213064, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.320 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4bd944-dccd-475d-9294-0d23bfaddce4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372791, 'reachable_time': 44846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213065, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 nova_compute[182755]: 2026-01-21 23:46:09.341 182759 INFO nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Post operation of migration started#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.360 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[803bb4ae-85d0-4556-b98b-0436d787ef14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.453 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[be4bedb7-c53b-415d-a406-243fb53c2929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.456 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.456 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.457 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7816b8e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:09 np0005591285 nova_compute[182755]: 2026-01-21 23:46:09.460 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:09 np0005591285 NetworkManager[55017]: <info>  [1769039169.4607] manager: (tapb7816b8e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 21 18:46:09 np0005591285 kernel: tapb7816b8e-50: entered promiscuous mode
Jan 21 18:46:09 np0005591285 nova_compute[182755]: 2026-01-21 23:46:09.464 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.467 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb7816b8e-50, col_values=(('external_ids', {'iface-id': 'ecebff42-11cb-48b4-9c3d-966172998a49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:09 np0005591285 nova_compute[182755]: 2026-01-21 23:46:09.469 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:09 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:09Z|00059|binding|INFO|Releasing lport ecebff42-11cb-48b4-9c3d-966172998a49 from this chassis (sb_readonly=0)
Jan 21 18:46:09 np0005591285 nova_compute[182755]: 2026-01-21 23:46:09.492 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.493 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.495 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fd971ed9-3248-43bf-8202-cf85e4dcc08f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.496 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:46:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:09.499 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'env', 'PROCESS_TAG=haproxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:46:09 np0005591285 podman[213098]: 2026-01-21 23:46:09.955100333 +0000 UTC m=+0.071332898 container create 20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:46:10 np0005591285 systemd[1]: Started libpod-conmon-20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b.scope.
Jan 21 18:46:10 np0005591285 podman[213098]: 2026-01-21 23:46:09.921611353 +0000 UTC m=+0.037844008 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:46:10 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:46:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6952301675a4cf3fe0926dd062687530c3772970b250c346e29a97a2a0cea90/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:46:10 np0005591285 podman[213098]: 2026-01-21 23:46:10.058459473 +0000 UTC m=+0.174692058 container init 20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 18:46:10 np0005591285 podman[213098]: 2026-01-21 23:46:10.066230572 +0000 UTC m=+0.182463147 container start 20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 18:46:10 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [NOTICE]   (213117) : New worker (213119) forked
Jan 21 18:46:10 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [NOTICE]   (213117) : Loading success.
Jan 21 18:46:11 np0005591285 nova_compute[182755]: 2026-01-21 23:46:11.197 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:46:11 np0005591285 nova_compute[182755]: 2026-01-21 23:46:11.197 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:46:11 np0005591285 nova_compute[182755]: 2026-01-21 23:46:11.198 182759 DEBUG nova.network.neutron [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:46:11 np0005591285 nova_compute[182755]: 2026-01-21 23:46:11.545 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:13 np0005591285 nova_compute[182755]: 2026-01-21 23:46:13.934 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:14 np0005591285 nova_compute[182755]: 2026-01-21 23:46:14.071 182759 DEBUG nova.network.neutron [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:14 np0005591285 nova_compute[182755]: 2026-01-21 23:46:14.094 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:46:14 np0005591285 nova_compute[182755]: 2026-01-21 23:46:14.130 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:14 np0005591285 nova_compute[182755]: 2026-01-21 23:46:14.131 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:14 np0005591285 nova_compute[182755]: 2026-01-21 23:46:14.131 182759 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:14 np0005591285 nova_compute[182755]: 2026-01-21 23:46:14.139 182759 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 21 18:46:14 np0005591285 virtqemud[182299]: Domain id=4 name='instance-0000000c' uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7 is tainted: custom-monitor
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.150 182759 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 21 18:46:15 np0005591285 podman[213128]: 2026-01-21 23:46:15.242081297 +0000 UTC m=+0.092583281 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.616 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.616 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.634 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.847 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.848 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.858 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:46:15 np0005591285 nova_compute[182755]: 2026-01-21 23:46:15.859 182759 INFO nova.compute.claims [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.065 182759 DEBUG nova.compute.provider_tree [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.082 182759 DEBUG nova.scheduler.client.report [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.130 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.131 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.159 182759 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.167 182759 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.213 182759 DEBUG nova.objects.instance [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.257 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.258 182759 DEBUG nova.network.neutron [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.312 182759 INFO nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.384 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.598 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.600 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.600 182759 INFO nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Creating image(s)#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.601 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "/var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.601 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "/var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.602 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "/var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.615 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.699 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.701 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.702 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.730 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.822 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.824 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.887 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.889 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.890 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.982 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.984 182759 DEBUG nova.virt.disk.api [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Checking if we can resize image /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:46:16 np0005591285 nova_compute[182755]: 2026-01-21 23:46:16.985 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.017 182759 DEBUG nova.network.neutron [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.018 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.082 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.083 182759 DEBUG nova.virt.disk.api [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Cannot resize image /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.084 182759 DEBUG nova.objects.instance [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lazy-loading 'migration_context' on Instance uuid c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.118 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.118 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Ensure instance console log exists: /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.119 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.119 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.120 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.122 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.128 182759 WARNING nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.134 182759 DEBUG nova.virt.libvirt.host [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.135 182759 DEBUG nova.virt.libvirt.host [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.145 182759 DEBUG nova.virt.libvirt.host [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.145 182759 DEBUG nova.virt.libvirt.host [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.148 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.148 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.149 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.149 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.149 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.149 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.150 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.150 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.150 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.151 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.151 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.151 182759 DEBUG nova.virt.hardware [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.157 182759 DEBUG nova.objects.instance [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.175 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <uuid>c5dba36b-76a4-4e09-bb90-2f8ef859d5f6</uuid>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <name>instance-0000000f</name>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-792157281</nova:name>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:46:17</nova:creationTime>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:user uuid="aa25befcc85f49009cc03d3f9a7af21a">tempest-ServersAdminNegativeTestJSON-1173112993-project-member</nova:user>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:        <nova:project uuid="7b93a1e09d8a4019807c39b0826b8c31">tempest-ServersAdminNegativeTestJSON-1173112993</nova:project>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <entry name="serial">c5dba36b-76a4-4e09-bb90-2f8ef859d5f6</entry>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <entry name="uuid">c5dba36b-76a4-4e09-bb90-2f8ef859d5f6</entry>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.config"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/console.log" append="off"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:46:17 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:46:17 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:46:17 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:46:17 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.251 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.251 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:46:17 np0005591285 nova_compute[182755]: 2026-01-21 23:46:17.252 182759 INFO nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Using config drive#033[00m
Jan 21 18:46:18 np0005591285 nova_compute[182755]: 2026-01-21 23:46:18.174 182759 INFO nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Creating config drive at /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.config#033[00m
Jan 21 18:46:18 np0005591285 nova_compute[182755]: 2026-01-21 23:46:18.188 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4e8xfv_j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:18 np0005591285 nova_compute[182755]: 2026-01-21 23:46:18.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:18 np0005591285 nova_compute[182755]: 2026-01-21 23:46:18.220 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:18 np0005591285 nova_compute[182755]: 2026-01-21 23:46:18.329 182759 DEBUG oslo_concurrency.processutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4e8xfv_j" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:18 np0005591285 systemd-machined[154022]: New machine qemu-5-instance-0000000f.
Jan 21 18:46:18 np0005591285 systemd[1]: Started Virtual Machine qemu-5-instance-0000000f.
Jan 21 18:46:18 np0005591285 podman[213178]: 2026-01-21 23:46:18.562896501 +0000 UTC m=+0.144734924 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:46:18 np0005591285 nova_compute[182755]: 2026-01-21 23:46:18.940 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.263 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.342 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039179.3420131, c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.343 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.345 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.346 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.352 182759 INFO nova.virt.libvirt.driver [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Instance spawned successfully.#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.352 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.389 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.394 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.400 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.400 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.400 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.401 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.401 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.401 182759 DEBUG nova.virt.libvirt.driver [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.440 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.440 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039179.343537, c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.440 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] VM Started (Lifecycle Event)#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.480 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.487 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.503 182759 INFO nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Took 2.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.503 182759 DEBUG nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.518 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.620 182759 INFO nova.compute.manager [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Took 3.83 seconds to build instance.#033[00m
Jan 21 18:46:19 np0005591285 nova_compute[182755]: 2026-01-21 23:46:19.649 182759 DEBUG oslo_concurrency.lockutils [None req-9da0126d-3d47-4846-ba6b-b14bd2f59e32 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:20 np0005591285 nova_compute[182755]: 2026-01-21 23:46:20.140 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:46:20 np0005591285 nova_compute[182755]: 2026-01-21 23:46:20.140 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:46:20 np0005591285 nova_compute[182755]: 2026-01-21 23:46:20.140 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:46:20 np0005591285 nova_compute[182755]: 2026-01-21 23:46:20.141 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:21 np0005591285 nova_compute[182755]: 2026-01-21 23:46:21.553 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.152 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1298204af0f241dc8b63851b2046cf5c', 'user_id': '553fdc065acf4000a185abac43878ab4', 'hostId': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.160 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7b93a1e09d8a4019807c39b0826b8c31', 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'hostId': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.161 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.162 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>]
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.164 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.164 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>]
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.171 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 69dceb72-db44-4bfc-9b98-cc8b39885ae7 / tapbae5fde2-5e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.172 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef3ef976-47f4-49e8-827d-90acf3e60afd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.164798', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '6492ce50-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': 'ca75502e018da8337f98711d4d986cff64ebbab1f2cdd4f75787ffcb75c08ac1'}]}, 'timestamp': '2026-01-21 23:46:23.180307', '_unique_id': 'f6c171308d00457c9fcd21c903de635a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.203 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.207 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4e2a17a-ed65-4ad4-88ec-358e3b0377c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.207422', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '6497e138-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': 'fd5a46c9d5bdb579f7a43c62201727c1d57486f5d68234bd516e28c7d9a2f92a'}]}, 'timestamp': '2026-01-21 23:46:23.207990', '_unique_id': 'cca14a7afecf44d3a7988d7d1cb21f73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.209 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.210 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca99cdd1-a3f7-4ce5-bf5a-95ac750ebf2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.210173', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64984c40-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': 'fbaf8acf27a47e23f5bd746829ef18813565610d5223125bd8ad41f64add198e'}]}, 'timestamp': '2026-01-21 23:46:23.210582', '_unique_id': '20317b9d248747afa9e8fa02645ae0a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 18:46:23 np0005591285 podman[213221]: 2026-01-21 23:46:23.234431675 +0000 UTC m=+0.087954375 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 18:46:23 np0005591285 podman[213222]: 2026-01-21 23:46:23.234677622 +0000 UTC m=+0.068605835 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.250 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.write.requests volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.251 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.299 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.300 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49b9c628-6932-414b-98f4-4790651fd914', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 13, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.212440', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '649e85b0-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '69da548d63628b945b6666ce2e4afcaf2672863397ff94c3907f0a47309a4af5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 
'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.212440', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '649e9582-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '6ef5da6902e424f6cc4e3acf619aa019c019fa876fe3b023c6969acac1df8946'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.212440', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64a5ff8e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': 'b36e82f49a4ead0ad84600de9dabe8b39d4476afa042e2bc08366ec8949ca1d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.212440', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64a60b6e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': 'd49567dc945c1712ce10f9d5df7a7dd86d8b65ea581aad3a77cdec4fb90f0c0d'}]}, 'timestamp': '2026-01-21 23:46:23.300613', '_unique_id': '399cb680de3443019da89b22f9666470'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.302 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.303 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '801c86f7-2a6b-4b9b-a6b6-c945b29bdfeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.303596', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64a6911a-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': 'f194e8f2217c5b99efc37b3a5dda690350c89f8e351f6398fd8ef67aa392da75'}]}, 'timestamp': '2026-01-21 23:46:23.304091', '_unique_id': '0a68ff60163c49c09d3200acd19ebfe9'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.304 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.305 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.305 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.306 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.306 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.306 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57f9acbd-43f6-4f89-8003-e2b6746c9c9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.305771', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64a6e37c-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '8e1eb86164e6a770e5eefec8660bef59fd3a4bf7f3e2633d3efcfde1a0163cb4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 
'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.305771', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64a6efc0-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': 'd46f7f148dade0a97077d02adee70612229a6f25405da6bd0b912d6265b522f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.305771', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64a6f98e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': '333498114b1d207892e35612e2aafdb6c02af42cffb7a6f35f2ca6e25ce809f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.305771', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64a703e8-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': 'e5e3c5238b19e5c74fc7f2fc475aba160a3d0fbc209d9bd208c14672e99856df'}]}, 'timestamp': '2026-01-21 23:46:23.307002', '_unique_id': '2e6be4d0f11a4ceda1b11ff56d23eaef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.308 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.320 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.321 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.331 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.331 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e9ae21-1b27-48b0-ae6b-822b7a404318', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.308647', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64a9388e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.028091479, 'message_signature': '830d670d42c873b33e157da56b3863166bf0dd05a320b5baa64ec128f3fe29b2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 
'69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.308647', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64a94450-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.028091479, 'message_signature': '34b0037821fa2eb4f6cb83e391fac1cc1331b9d8989775ef97b573aea3bd9b27'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.308647', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64aacc9e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.040907824, 'message_signature': '1411ff0e527c67b700ffd6279dd94e23f672cec425e4d06f42fc5e965f9b6fbf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.308647', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64aad680-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.040907824, 'message_signature': 'a02e0ee07b6f67f8e0e7e8a81ad59e964b78ddad024fe613826027c66f04f724'}]}, 'timestamp': '2026-01-21 23:46:23.332027', '_unique_id': '664b0525996e4e5984ab5ef784b1ba2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.333 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9b9daf7-b80e-456c-b97b-31670768c108', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.333803', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64ab2a4a-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': '71862042b41127e77e4f35ab57265764bdf6b21a2376ed6a11ebc528993407c5'}]}, 'timestamp': '2026-01-21 23:46:23.334168', '_unique_id': '594843d02e8c480bba08c71af669285f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.334 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.335 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2eb794a8-1a0d-4c6b-892d-282533ce5f6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.335365', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64ab6334-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': '06d8230c8455aeb23eb06e813a2d8f4c1a953c13b9f43c53d0cf4d7f935c3e99'}]}, 'timestamp': '2026-01-21 23:46:23.335618', '_unique_id': '83aff4d5635947eb838fb0aa035419d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.336 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efb40200-8aae-46dd-8a57-706e6909da0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.336730', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64ab9ade-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': 'dd9a718d4149ff16dbbb0a8aa2886c491587d69eb2b2e1e43793d325da993e76'}]}, 'timestamp': '2026-01-21 23:46:23.337042', '_unique_id': '057294989d454fbc9c587e5b27dbb5a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.337 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ef7cc5c-7e93-45e1-8b35-5b2f05329f0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.338109', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64abcd88-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': '6d41475e7c633e8f48613822b05b49b1456046721377811e499f32bbec2c27cb'}]}, 'timestamp': '2026-01-21 23:46:23.338337', '_unique_id': '7dc224b9d6ea4b86ac394b9e2dc711cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.339 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.339 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.339 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>]
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.339 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.write.bytes volume: 98304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.340 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.340 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.340 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e469dd5b-1226-41d8-8961-180592cbe395', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 98304, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.339706', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64ac0e7e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '44984cfca984157783f327823e06a30e438a4bc428326456df425e95c9f17daa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.339706', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64ac1716-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '3c7f87e10f6b576081af3d8d21b887611f684961937f5bbf8be3a9982f708afb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.339706', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64ac1ec8-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': '14f4233cb40502677a53b5f15bfdb92848fed68c18f48d3d942314a72fed9db1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.339706', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64ac2620-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': 'cf0cb9b170146d5b5b4c52e05c1eadab7a634fb5b6599d829f781ec02631ccf4'}]}, 'timestamp': '2026-01-21 23:46:23.340588', '_unique_id': '9239cbb7fcca4944a5fdb2d55406ab15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.341 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.342 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.read.latency volume: 108296964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.342 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.read.latency volume: 714989 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c707406-8c13-47b6-ac4c-4acb14122473', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.341739', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64ac5b9a-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': 'b6400ebf3eb3b646a3189a6b3199aa46f844a78f42657591acdf3a396b502465'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.341739', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64ac6478-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '2865d7719e13b3e7a0b5aec7ee23fe8d44dc4b8f545f25b8b8ab758187caf04a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 108296964, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.341739', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64ac6bee-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': '479c47287cb748a896387f7cc2a44b493c877b46d0750be0c9092200ea825348'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 714989, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.341739', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64ac733c-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': 'acebea7a6e7430b6fb53bbb9cdb58e07b13c460fb023ad34a8b78612fcca5799'}]}, 'timestamp': '2026-01-21 23:46:23.342560', '_unique_id': '06da5753d15f4eae84a11c7fd02e17cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1333035319>, <NovaLikeServer: tempest-ServersAdminNegativeTestJSON-server-792157281>]
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.343 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be8ec95c-170d-4208-827a-8a78d55e031e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.343995', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64acb388-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': 'd024d8d1257b933c9ef2b8878cc251c80fabb34d935c7ff176fd2b63465d865e'}]}, 'timestamp': '2026-01-21 23:46:23.344224', '_unique_id': '76490e4503bd4dafb43f681e90da9d6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.344 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.345 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.363 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/cpu volume: 110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.382 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/cpu volume: 3720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92401454-9718-4ad9-8fa8-8d28527b7eea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 110000000, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'timestamp': '2026-01-21T23:46:23.345300', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '64afbb0a-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.081973608, 'message_signature': '43601798e4a2feafe776ec78ae67329eec29af70e28c8658cf33a4b58fefbb39'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3720000000, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 
'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'timestamp': '2026-01-21T23:46:23.345300', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '64b29f0a-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.101343219, 'message_signature': 'eaeab27e21dee7910876ccc4e89de85a3e1818f7c70d46955d6d945e89176bfd'}]}, 'timestamp': '2026-01-21 23:46:23.383245', '_unique_id': '412df0183de349ddb90c29456c3b717d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.384 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.386 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.386 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.387 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.387 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.388 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf545e29-69a5-49f9-95b8-d4def8ac97f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.386524', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b3379e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.028091479, 'message_signature': 'afe8e61d177149089a7eb9562dca0c7e188e89e58d12189e082828619705eed7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 
'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.386524', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b34bc6-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.028091479, 'message_signature': '7712f84a45422c1920f4e52f7cb01050219b81cdc18b3ac3353c227946152d17'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.386524', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b35ce2-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.040907824, 'message_signature': 'd52e7323a9990d2b73d435c2ba2739f23076a01df7cc3abe89ebe082d43cb720'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.386524', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b36f20-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.040907824, 'message_signature': 'eed0e64c53de413e94cb705688aff35b68c04ceac249ede728b3f93b8e26328a'}]}, 'timestamp': '2026-01-21 23:46:23.388455', '_unique_id': '9df29d3ca2eb493cb0f691845e863627'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.389 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.391 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.391 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.392 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.392 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.393 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb2381c8-7327-4a89-9592-a9793438c245', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.391454', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b3fefe-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '8c0ad95fb9bc664b358161e0dad33d8e11db001a16fdedb7179b99a1c3ab9ac9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.391454', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b4154c-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '4ce893b94c1896797eda61c24f623b3791b6fc8a25fdc16fe2c7dde4cda5b07d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.391454', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b42a50-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': '551a146c4f2e945354235fb2adfcb8d82b7a7bfbb361873cf0129c41225378a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.391454', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b43aea-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': 'a99174a9f80ae64c1d6be5e169a05c0118a40a9df2f33b182f8eecbab7574bbe'}]}, 'timestamp': '2026-01-21 23:46:23.393669', '_unique_id': 'a4f20390a1a44ed6bf6839c6078d28e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.395 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.395 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.396 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.396 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.396 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b63afb31-32f5-4184-ad39-3b5515ab4c9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.395814', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b4a110-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.028091479, 'message_signature': '73de02913a729469d79d67a3da2c06c6db892be5a782d748137b1cbf76d62d11'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.395814', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b4ad5e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.028091479, 'message_signature': '3479e9693101cd78ed663ab8dbed1af3d108f15976c2b9d4a0060dc44a1f69a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.395814', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b4b966-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.040907824, 'message_signature': '0d722b50775eb015a4839ab88b919baf91bb5301966eba3b80a18b7da192aeec'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.395814', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b4c5be-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.040907824, 'message_signature': 'ebd666c87cfb94cacbe1f9d9dc5b5d6ed13e234e0649a051fb3a3ad8f1b0324c'}]}, 'timestamp': '2026-01-21 23:46:23.397154', '_unique_id': '90fe06f4032f48868741bd9f40d80078'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.399 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.399 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/memory.usage volume: 42.84375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.399 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.400 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c5dba36b-76a4-4e09-bb90-2f8ef859d5f6: ceilometer.compute.pollsters.NoVolumeException
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1e7c14e-b487-4163-b7d3-84702578397e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.84375, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'timestamp': '2026-01-21T23:46:23.399520', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '64b5304e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3742.081973608, 'message_signature': '901fbd817fa76e5451c31c3e539f1f8f0a67988956423ce366f4f2971e2a01fe'}]}, 'timestamp': '2026-01-21 23:46:23.400286', '_unique_id': 'a252d80dc3fb48bfbdb75122f82e21f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.404 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a99d274b-a688-4887-92b3-1d7be8def7e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': 'instance-0000000c-69dceb72-db44-4bfc-9b98-cc8b39885ae7-tapbae5fde2-5e', 'timestamp': '2026-01-21T23:46:23.404737', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'tapbae5fde2-5e', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:ac:86', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbae5fde2-5e'}, 'message_id': '64b5fd4e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.885073423, 'message_signature': '29def12536223df8f6de36356af5d0fb5e7ca0537097b0c35f43cbe60d943a6c'}]}, 'timestamp': '2026-01-21 23:46:23.405995', '_unique_id': 'd3476dee73fc427299a610349168a394'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.406 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.407 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.407 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.write.latency volume: 8296322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.408 12 DEBUG ceilometer.compute.pollsters [-] 69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.408 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.409 12 DEBUG ceilometer.compute.pollsters [-] c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa7f080b-f018-4e33-b165-221a2b240afe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8296322, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-vda', 'timestamp': '2026-01-21T23:46:23.407734', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b66ff4-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '1f3d88ae80ba74c18954c6ee4de47f945899e62a172aaa3543c51d635937ef85'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '553fdc065acf4000a185abac43878ab4', 'user_name': None, 'project_id': '1298204af0f241dc8b63851b2046cf5c', 
'project_name': None, 'resource_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7-sda', 'timestamp': '2026-01-21T23:46:23.407734', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1333035319', 'name': 'instance-0000000c', 'instance_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'instance_type': 'm1.nano', 'host': 'b33f8342b081136aa645faba69efa3119d62b1f83776af4a141e0346', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b679fe-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.931950324, 'message_signature': '515a27f2185ebdb280ba64dff9f297e34de7f420afd86c59c4ccefffb57c1177'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-vda', 'timestamp': '2026-01-21T23:46:23.407734', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '64b6a9e2-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': '7484a5343d2fa31fa85f90a593861274bcb7108dffdc5653f90aca73378c83eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'aa25befcc85f49009cc03d3f9a7af21a', 'user_name': None, 'project_id': '7b93a1e09d8a4019807c39b0826b8c31', 'project_name': None, 'resource_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-sda', 'timestamp': '2026-01-21T23:46:23.407734', 'resource_metadata': {'display_name': 'tempest-ServersAdminNegativeTestJSON-server-792157281', 'name': 'instance-0000000f', 'instance_id': 'c5dba36b-76a4-4e09-bb90-2f8ef859d5f6', 'instance_type': 'm1.nano', 'host': '430eeb3a2cb08ade65e170c593d235040b27860c4cbed7ba90a2e749', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '64b6cb8e-f723-11f0-b13b-fa163e425b77', 'monotonic_time': 3741.970943953, 'message_signature': '780fa87e1ecf03a9260d559beb9dd45e258a63fe0e959324f3f6254a4e7d6620'}]}, 'timestamp': '2026-01-21 23:46:23.410504', '_unique_id': '744c036f1c394c39914c6fb2a5dd08c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:46:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.646 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.678 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.679 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.680 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.681 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.681 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.720 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.721 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.722 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.722 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.852 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.941 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.958 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:23 np0005591285 nova_compute[182755]: 2026-01-21 23:46:23.960 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.023 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.030 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.106 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.108 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.187 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.432 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.435 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5479MB free_disk=73.34956741333008GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.435 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.436 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.595 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 69dceb72-db44-4bfc-9b98-cc8b39885ae7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.596 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.596 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.596 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.730 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.749 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.784 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:46:24 np0005591285 nova_compute[182755]: 2026-01-21 23:46:24.785 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:25 np0005591285 nova_compute[182755]: 2026-01-21 23:46:25.323 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:25 np0005591285 nova_compute[182755]: 2026-01-21 23:46:25.324 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:46:26 np0005591285 nova_compute[182755]: 2026-01-21 23:46:26.594 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:28 np0005591285 nova_compute[182755]: 2026-01-21 23:46:28.945 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:31 np0005591285 nova_compute[182755]: 2026-01-21 23:46:31.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:33 np0005591285 nova_compute[182755]: 2026-01-21 23:46:33.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:35 np0005591285 nova_compute[182755]: 2026-01-21 23:46:35.224 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:35.224 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:46:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:35.228 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:46:35 np0005591285 podman[213293]: 2026-01-21 23:46:35.23010488 +0000 UTC m=+0.091259505 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 18:46:36 np0005591285 nova_compute[182755]: 2026-01-21 23:46:36.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:38 np0005591285 podman[213315]: 2026-01-21 23:46:38.221392662 +0000 UTC m=+0.088406867 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Jan 21 18:46:38 np0005591285 nova_compute[182755]: 2026-01-21 23:46:38.953 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.934 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.935 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.936 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.936 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.937 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.955 182759 INFO nova.compute.manager [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Terminating instance#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.972 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "refresh_cache-c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.972 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquired lock "refresh_cache-c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:46:39 np0005591285 nova_compute[182755]: 2026-01-21 23:46:39.973 182759 DEBUG nova.network.neutron [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.182 182759 DEBUG nova.network.neutron [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.596 182759 DEBUG nova.network.neutron [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.613 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Releasing lock "refresh_cache-c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.614 182759 DEBUG nova.compute.manager [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:46:40 np0005591285 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Jan 21 18:46:40 np0005591285 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000f.scope: Consumed 13.539s CPU time.
Jan 21 18:46:40 np0005591285 systemd-machined[154022]: Machine qemu-5-instance-0000000f terminated.
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.893 182759 INFO nova.virt.libvirt.driver [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Instance destroyed successfully.#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.894 182759 DEBUG nova.objects.instance [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lazy-loading 'resources' on Instance uuid c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.941 182759 INFO nova.virt.libvirt.driver [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Deleting instance files /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6_del#033[00m
Jan 21 18:46:40 np0005591285 nova_compute[182755]: 2026-01-21 23:46:40.943 182759 INFO nova.virt.libvirt.driver [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Deletion of /var/lib/nova/instances/c5dba36b-76a4-4e09-bb90-2f8ef859d5f6_del complete#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.033 182759 INFO nova.compute.manager [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.034 182759 DEBUG oslo.service.loopingcall [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.035 182759 DEBUG nova.compute.manager [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.035 182759 DEBUG nova.network.neutron [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.247 182759 DEBUG nova.network.neutron [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.271 182759 DEBUG nova.network.neutron [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.292 182759 INFO nova.compute.manager [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Took 0.26 seconds to deallocate network for instance.#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.420 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.421 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.527 182759 DEBUG nova.compute.provider_tree [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.549 182759 DEBUG nova.scheduler.client.report [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.646 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.665 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.728 182759 INFO nova.scheduler.client.report [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Deleted allocations for instance c5dba36b-76a4-4e09-bb90-2f8ef859d5f6#033[00m
Jan 21 18:46:41 np0005591285 nova_compute[182755]: 2026-01-21 23:46:41.842 182759 DEBUG oslo_concurrency.lockutils [None req-b883789c-43ec-4984-84d8-1f3853ceeb11 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "c5dba36b-76a4-4e09-bb90-2f8ef859d5f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:43 np0005591285 nova_compute[182755]: 2026-01-21 23:46:43.957 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:44.231 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:46 np0005591285 podman[213346]: 2026-01-21 23:46:46.226225385 +0000 UTC m=+0.088907122 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:46:46 np0005591285 nova_compute[182755]: 2026-01-21 23:46:46.693 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.182 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.183 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.210 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.357 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.358 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.367 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.367 182759 INFO nova.compute.claims [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.555 182759 DEBUG nova.compute.provider_tree [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.575 182759 DEBUG nova.scheduler.client.report [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.606 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.607 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.687 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.688 182759 DEBUG nova.network.neutron [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.723 182759 INFO nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.746 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.960 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:48 np0005591285 nova_compute[182755]: 2026-01-21 23:46:48.970 182759 DEBUG nova.policy [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a6034ff39094b6486bac680b7ed5a57', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.118 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.121 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.122 182759 INFO nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Creating image(s)#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.123 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "/var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.124 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.125 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.164 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.261 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.263 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.264 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:49 np0005591285 podman[213371]: 2026-01-21 23:46:49.278169372 +0000 UTC m=+0.143897990 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.285 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.339 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.340 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.378 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.379 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.380 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.479 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.481 182759 DEBUG nova.virt.disk.api [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Checking if we can resize image /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.481 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.553 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.555 182759 DEBUG nova.virt.disk.api [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Cannot resize image /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.557 182759 DEBUG nova.objects.instance [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'migration_context' on Instance uuid 3a585d4f-6f31-4651-b848-0470f4eed464 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.583 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.583 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Ensure instance console log exists: /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.584 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.585 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.586 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:49 np0005591285 nova_compute[182755]: 2026-01-21 23:46:49.733 182759 DEBUG nova.network.neutron [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Successfully created port: eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.724 182759 DEBUG nova.network.neutron [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Successfully updated port: eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.760 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.760 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquired lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.761 182759 DEBUG nova.network.neutron [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.927 182759 DEBUG nova.compute.manager [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-changed-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.928 182759 DEBUG nova.compute.manager [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Refreshing instance network info cache due to event network-changed-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:46:50 np0005591285 nova_compute[182755]: 2026-01-21 23:46:50.928 182759 DEBUG oslo_concurrency.lockutils [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:46:51 np0005591285 nova_compute[182755]: 2026-01-21 23:46:51.148 182759 DEBUG nova.network.neutron [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:46:51 np0005591285 nova_compute[182755]: 2026-01-21 23:46:51.698 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.200 182759 DEBUG nova.network.neutron [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Updating instance_info_cache with network_info: [{"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.231 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Releasing lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.232 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Instance network_info: |[{"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.233 182759 DEBUG oslo_concurrency.lockutils [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.233 182759 DEBUG nova.network.neutron [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Refreshing network info cache for port eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.239 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Start _get_guest_xml network_info=[{"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.245 182759 WARNING nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.253 182759 DEBUG nova.virt.libvirt.host [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.254 182759 DEBUG nova.virt.libvirt.host [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.259 182759 DEBUG nova.virt.libvirt.host [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.260 182759 DEBUG nova.virt.libvirt.host [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.262 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.263 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.263 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.264 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.264 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.265 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.265 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.265 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.266 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.266 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.267 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.267 182759 DEBUG nova.virt.hardware [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.274 182759 DEBUG nova.virt.libvirt.vif [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:46:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-308403715',display_name='tempest-ServersAdminTestJSON-server-308403715',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-308403715',id=19,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-z067t4bf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:49Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=3a585d4f-6f31-4651-b848-0470f4eed464,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.275 182759 DEBUG nova.network.os_vif_util [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.276 182759 DEBUG nova.network.os_vif_util [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.278 182759 DEBUG nova.objects.instance [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a585d4f-6f31-4651-b848-0470f4eed464 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.309 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <uuid>3a585d4f-6f31-4651-b848-0470f4eed464</uuid>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <name>instance-00000013</name>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersAdminTestJSON-server-308403715</nova:name>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:46:52</nova:creationTime>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:user uuid="4a6034ff39094b6486bac680b7ed5a57">tempest-ServersAdminTestJSON-1815099341-project-member</nova:user>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:project uuid="4d40fc03fb534b5689415f3d8a3de1fc">tempest-ServersAdminTestJSON-1815099341</nova:project>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        <nova:port uuid="eb56f7d1-8dec-4faa-a727-c8bdf54f0af5">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <entry name="serial">3a585d4f-6f31-4651-b848-0470f4eed464</entry>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <entry name="uuid">3a585d4f-6f31-4651-b848-0470f4eed464</entry>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.config"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:ae:45:35"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <target dev="tapeb56f7d1-8d"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/console.log" append="off"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:46:52 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:46:52 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:46:52 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:46:52 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.310 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Preparing to wait for external event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.311 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.311 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.312 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.314 182759 DEBUG nova.virt.libvirt.vif [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:46:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-308403715',display_name='tempest-ServersAdminTestJSON-server-308403715',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-308403715',id=19,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-z067t4bf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:49Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=3a585d4f-6f31-4651-b848-0470f4eed464,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.315 182759 DEBUG nova.network.os_vif_util [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.316 182759 DEBUG nova.network.os_vif_util [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.317 182759 DEBUG os_vif [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.319 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.320 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.320 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.329 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.330 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb56f7d1-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.331 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeb56f7d1-8d, col_values=(('external_ids', {'iface-id': 'eb56f7d1-8dec-4faa-a727-c8bdf54f0af5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:45:35', 'vm-uuid': '3a585d4f-6f31-4651-b848-0470f4eed464'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.334 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:46:52 np0005591285 NetworkManager[55017]: <info>  [1769039212.3368] manager: (tapeb56f7d1-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.345 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.346 182759 INFO os_vif [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d')#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.413 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.413 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.413 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No VIF found with MAC fa:16:3e:ae:45:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:46:52 np0005591285 nova_compute[182755]: 2026-01-21 23:46:52.414 182759 INFO nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Using config drive#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.019 182759 INFO nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Creating config drive at /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.config#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.030 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbllpnto0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.173 182759 DEBUG oslo_concurrency.processutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbllpnto0" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:46:53 np0005591285 kernel: tapeb56f7d1-8d: entered promiscuous mode
Jan 21 18:46:53 np0005591285 NetworkManager[55017]: <info>  [1769039213.2285] manager: (tapeb56f7d1-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 21 18:46:53 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:53Z|00060|binding|INFO|Claiming lport eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 for this chassis.
Jan 21 18:46:53 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:53Z|00061|binding|INFO|eb56f7d1-8dec-4faa-a727-c8bdf54f0af5: Claiming fa:16:3e:ae:45:35 10.100.0.6
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.269 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.273 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.293 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:45:35 10.100.0.6'], port_security=['fa:16:3e:ae:45:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3a585d4f-6f31-4651-b848-0470f4eed464', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.296 104259 INFO neutron.agent.ovn.metadata.agent [-] Port eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.298 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.317 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b5582e7d-5a94-4607-a356-84868d42cc16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.318 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1530a22a-f1 in ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.322 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1530a22a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.322 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3e6cc0-01ad-415c-a2b4-437179df170e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.323 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ed2cd8-1080-4fb3-8daf-03cafc80e3e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 systemd-machined[154022]: New machine qemu-6-instance-00000013.
Jan 21 18:46:53 np0005591285 systemd[1]: Started Virtual Machine qemu-6-instance-00000013.
Jan 21 18:46:53 np0005591285 systemd-udevd[213449]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.340 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[9145fcb0-a909-4148-9079-c9e2ad768996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:53Z|00062|binding|INFO|Setting lport eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 ovn-installed in OVS
Jan 21 18:46:53 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:53Z|00063|binding|INFO|Setting lport eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 up in Southbound
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.354 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 NetworkManager[55017]: <info>  [1769039213.3577] device (tapeb56f7d1-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:46:53 np0005591285 NetworkManager[55017]: <info>  [1769039213.3585] device (tapeb56f7d1-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.382 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9117d8c8-bc12-4ef6-b0a8-a7b2648d7da7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 podman[213430]: 2026-01-21 23:46:53.398211687 +0000 UTC m=+0.086475035 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:46:53 np0005591285 podman[213429]: 2026-01-21 23:46:53.409800197 +0000 UTC m=+0.111851074 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.421 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2690a180-535e-43c1-b137-f7151b59c0de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 NetworkManager[55017]: <info>  [1769039213.4299] manager: (tap1530a22a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.429 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d7278e-3df5-4378-a9d9-d5de66d78bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.471 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9ccef03b-ed97-494a-9a18-b1655b2a3c11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.474 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[78acf5ad-fb59-4e89-b4a7-0504fead77a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 NetworkManager[55017]: <info>  [1769039213.5015] device (tap1530a22a-f0): carrier: link connected
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.508 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cd418dd4-e482-4b8e-9a51-3e2b3d7a0243]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.529 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[45618962-fb57-45e8-8859-3607587f82ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377216, 'reachable_time': 23232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213510, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.553 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b83670a6-4183-4682-8cbb-9dcbde546ce9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea6:bf13'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377216, 'tstamp': 377216}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213512, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.572 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039213.571219, 3a585d4f-6f31-4651-b848-0470f4eed464 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.572 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] VM Started (Lifecycle Event)#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.571 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f61b4619-67a9-4060-bf49-edec5799ea66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377216, 'reachable_time': 23232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213513, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.603 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.608 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039213.5716245, 3a585d4f-6f31-4651-b848-0470f4eed464 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.608 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.608 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7221d956-bc5a-407e-adba-3f8da106cc37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.629 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.633 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.659 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.662 182759 DEBUG nova.compute.manager [req-2a0bd3e1-aa97-4488-8fba-1dd35a316ba8 req-71b55989-4098-455c-8cc2-8d55d896b02b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.662 182759 DEBUG oslo_concurrency.lockutils [req-2a0bd3e1-aa97-4488-8fba-1dd35a316ba8 req-71b55989-4098-455c-8cc2-8d55d896b02b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.663 182759 DEBUG oslo_concurrency.lockutils [req-2a0bd3e1-aa97-4488-8fba-1dd35a316ba8 req-71b55989-4098-455c-8cc2-8d55d896b02b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.663 182759 DEBUG oslo_concurrency.lockutils [req-2a0bd3e1-aa97-4488-8fba-1dd35a316ba8 req-71b55989-4098-455c-8cc2-8d55d896b02b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.663 182759 DEBUG nova.compute.manager [req-2a0bd3e1-aa97-4488-8fba-1dd35a316ba8 req-71b55989-4098-455c-8cc2-8d55d896b02b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Processing event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.666 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.675 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039213.6701255, 3a585d4f-6f31-4651-b848-0470f4eed464 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.675 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.678 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.681 182759 INFO nova.virt.libvirt.driver [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Instance spawned successfully.#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.682 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.685 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[08cd9537-623c-4d71-9358-c8ae38054d0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.687 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.687 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.688 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:53 np0005591285 kernel: tap1530a22a-f0: entered promiscuous mode
Jan 21 18:46:53 np0005591285 NetworkManager[55017]: <info>  [1769039213.6905] manager: (tap1530a22a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.689 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.692 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.695 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.696 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 ovn_controller[94908]: 2026-01-21T23:46:53Z|00064|binding|INFO|Releasing lport 1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd from this chassis (sb_readonly=0)
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.698 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.705 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.707 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.708 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.708 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.708 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.709 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.709 182759 DEBUG nova.virt.libvirt.driver [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.722 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.723 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[de85b046-7dd4-4dbe-a4b3-e5c6f621a521]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.724 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:46:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:46:53.726 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'env', 'PROCESS_TAG=haproxy-1530a22a-f758-407d-b1aa-fd922904fe07', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1530a22a-f758-407d-b1aa-fd922904fe07.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.740 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.790 182759 INFO nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Took 4.67 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.791 182759 DEBUG nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.935 182759 INFO nova.compute.manager [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Took 5.63 seconds to build instance.#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:53 np0005591285 nova_compute[182755]: 2026-01-21 23:46:53.970 182759 DEBUG oslo_concurrency.lockutils [None req-488460ed-5cba-442c-89a1-718ef1b00c2a 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:54 np0005591285 podman[213545]: 2026-01-21 23:46:54.230422299 +0000 UTC m=+0.088098288 container create 75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:46:54 np0005591285 podman[213545]: 2026-01-21 23:46:54.178190042 +0000 UTC m=+0.035866081 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:46:54 np0005591285 systemd[1]: Started libpod-conmon-75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1.scope.
Jan 21 18:46:54 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:46:54 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4db65bed9faebea719c23bd0484d4cee0c61e85524154af12d57705602cdb015/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:46:54 np0005591285 nova_compute[182755]: 2026-01-21 23:46:54.330 182759 DEBUG nova.network.neutron [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Updated VIF entry in instance network info cache for port eb56f7d1-8dec-4faa-a727-c8bdf54f0af5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:46:54 np0005591285 nova_compute[182755]: 2026-01-21 23:46:54.331 182759 DEBUG nova.network.neutron [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Updating instance_info_cache with network_info: [{"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:46:54 np0005591285 nova_compute[182755]: 2026-01-21 23:46:54.354 182759 DEBUG oslo_concurrency.lockutils [req-7e70e477-d73e-4516-a2f4-328baab479ee req-c2d387bf-319f-4d1e-9f69-0b0bbd3d5240 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:46:54 np0005591285 podman[213545]: 2026-01-21 23:46:54.360393566 +0000 UTC m=+0.218069615 container init 75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:46:54 np0005591285 podman[213545]: 2026-01-21 23:46:54.371813191 +0000 UTC m=+0.229489180 container start 75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:46:54 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [NOTICE]   (213564) : New worker (213566) forked
Jan 21 18:46:54 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [NOTICE]   (213564) : Loading success.
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.789 182759 DEBUG nova.compute.manager [req-9de1b0be-94af-48d0-b925-e170e205a5f3 req-d18d9de9-81ca-416e-b95c-506bb89b5dbc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.789 182759 DEBUG oslo_concurrency.lockutils [req-9de1b0be-94af-48d0-b925-e170e205a5f3 req-d18d9de9-81ca-416e-b95c-506bb89b5dbc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.790 182759 DEBUG oslo_concurrency.lockutils [req-9de1b0be-94af-48d0-b925-e170e205a5f3 req-d18d9de9-81ca-416e-b95c-506bb89b5dbc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.790 182759 DEBUG oslo_concurrency.lockutils [req-9de1b0be-94af-48d0-b925-e170e205a5f3 req-d18d9de9-81ca-416e-b95c-506bb89b5dbc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.791 182759 DEBUG nova.compute.manager [req-9de1b0be-94af-48d0-b925-e170e205a5f3 req-d18d9de9-81ca-416e-b95c-506bb89b5dbc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] No waiting events found dispatching network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.791 182759 WARNING nova.compute.manager [req-9de1b0be-94af-48d0-b925-e170e205a5f3 req-d18d9de9-81ca-416e-b95c-506bb89b5dbc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received unexpected event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.890 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039200.8887768, c5dba36b-76a4-4e09-bb90-2f8ef859d5f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.890 182759 INFO nova.compute.manager [-] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:46:55 np0005591285 nova_compute[182755]: 2026-01-21 23:46:55.912 182759 DEBUG nova.compute.manager [None req-9020c1c0-5902-4545-95ff-c2af8a783fbb - - - - - -] [instance: c5dba36b-76a4-4e09-bb90-2f8ef859d5f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:46:57 np0005591285 nova_compute[182755]: 2026-01-21 23:46:57.335 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:46:58 np0005591285 nova_compute[182755]: 2026-01-21 23:46:58.965 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:02 np0005591285 nova_compute[182755]: 2026-01-21 23:47:02.341 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:02.952 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:02.956 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:02.959 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:03 np0005591285 nova_compute[182755]: 2026-01-21 23:47:03.967 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:06 np0005591285 podman[213594]: 2026-01-21 23:47:06.236387353 +0000 UTC m=+0.091996682 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 18:47:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:06Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ae:45:35 10.100.0.6
Jan 21 18:47:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:06Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ae:45:35 10.100.0.6
Jan 21 18:47:07 np0005591285 nova_compute[182755]: 2026-01-21 23:47:07.345 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:08 np0005591285 nova_compute[182755]: 2026-01-21 23:47:08.970 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:09 np0005591285 podman[213614]: 2026-01-21 23:47:09.245409918 +0000 UTC m=+0.098711461 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:47:12 np0005591285 nova_compute[182755]: 2026-01-21 23:47:12.348 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:13 np0005591285 nova_compute[182755]: 2026-01-21 23:47:13.974 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:15 np0005591285 nova_compute[182755]: 2026-01-21 23:47:15.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:15 np0005591285 nova_compute[182755]: 2026-01-21 23:47:15.221 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 18:47:16 np0005591285 nova_compute[182755]: 2026-01-21 23:47:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:16 np0005591285 nova_compute[182755]: 2026-01-21 23:47:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:17 np0005591285 podman[213635]: 2026-01-21 23:47:17.209105483 +0000 UTC m=+0.072866690 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:47:17 np0005591285 nova_compute[182755]: 2026-01-21 23:47:17.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:18 np0005591285 nova_compute[182755]: 2026-01-21 23:47:18.230 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:18 np0005591285 nova_compute[182755]: 2026-01-21 23:47:18.231 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:18 np0005591285 nova_compute[182755]: 2026-01-21 23:47:18.977 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.512 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.513 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.514 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:47:19 np0005591285 nova_compute[182755]: 2026-01-21 23:47:19.514 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a585d4f-6f31-4651-b848-0470f4eed464 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:47:20 np0005591285 podman[213659]: 2026-01-21 23:47:20.283378932 +0000 UTC m=+0.140189181 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:47:21 np0005591285 nova_compute[182755]: 2026-01-21 23:47:21.569 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Updating instance_info_cache with network_info: [{"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:47:21 np0005591285 nova_compute[182755]: 2026-01-21 23:47:21.589 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-3a585d4f-6f31-4651-b848-0470f4eed464" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:47:21 np0005591285 nova_compute[182755]: 2026-01-21 23:47:21.590 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:47:21 np0005591285 nova_compute[182755]: 2026-01-21 23:47:21.590 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:21 np0005591285 nova_compute[182755]: 2026-01-21 23:47:21.590 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:21 np0005591285 nova_compute[182755]: 2026-01-21 23:47:21.591 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:47:22 np0005591285 nova_compute[182755]: 2026-01-21 23:47:22.353 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:22 np0005591285 nova_compute[182755]: 2026-01-21 23:47:22.586 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.257 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.258 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.258 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.259 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.374 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.376 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.377 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.377 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.378 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.396 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.432 182759 INFO nova.compute.manager [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Terminating instance#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.454 182759 DEBUG nova.compute.manager [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:47:23 np0005591285 kernel: tapbae5fde2-5e (unregistering): left promiscuous mode
Jan 21 18:47:23 np0005591285 NetworkManager[55017]: <info>  [1769039243.4886] device (tapbae5fde2-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:47:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:23Z|00065|binding|INFO|Releasing lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea from this chassis (sb_readonly=0)
Jan 21 18:47:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:23Z|00066|binding|INFO|Setting lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea down in Southbound
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:23Z|00067|binding|INFO|Removing iface tapbae5fde2-5e ovn-installed in OVS
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.508 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.112s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.509 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.529 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:ac:86 10.100.0.6'], port_security=['fa:16:3e:6f:ac:86 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '23', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bae5fde2-5ead-4ae5-90dd-1d6d468541ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.533 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bae5fde2-5ead-4ae5-90dd-1d6d468541ea in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c unbound from our chassis#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.537 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:47:23 np0005591285 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.545 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c50a3084-94d5-4f43-85de-74177171100f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 5.470s CPU time.
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.547 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace which is not needed anymore#033[00m
Jan 21 18:47:23 np0005591285 systemd-machined[154022]: Machine qemu-4-instance-0000000c terminated.
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:23 np0005591285 podman[213692]: 2026-01-21 23:47:23.607994729 +0000 UTC m=+0.088241412 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.613 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.621 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:23 np0005591285 podman[213696]: 2026-01-21 23:47:23.63345605 +0000 UTC m=+0.102444002 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.689 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.691 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:23 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [NOTICE]   (213117) : haproxy version is 2.8.14-c23fe91
Jan 21 18:47:23 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [NOTICE]   (213117) : path to executable is /usr/sbin/haproxy
Jan 21 18:47:23 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [WARNING]  (213117) : Exiting Master process...
Jan 21 18:47:23 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [WARNING]  (213117) : Exiting Master process...
Jan 21 18:47:23 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [ALERT]    (213117) : Current worker (213119) exited with code 143 (Terminated)
Jan 21 18:47:23 np0005591285 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213113]: [WARNING]  (213117) : All workers exited. Exiting... (0)
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.740 182759 DEBUG nova.compute.manager [req-af4bf664-0c34-4956-93ce-31aeb46d596a req-d5aa7d64-1bca-436c-9dfb-7925da71208f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.745 182759 DEBUG oslo_concurrency.lockutils [req-af4bf664-0c34-4956-93ce-31aeb46d596a req-d5aa7d64-1bca-436c-9dfb-7925da71208f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.745 182759 DEBUG oslo_concurrency.lockutils [req-af4bf664-0c34-4956-93ce-31aeb46d596a req-d5aa7d64-1bca-436c-9dfb-7925da71208f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.746 182759 DEBUG oslo_concurrency.lockutils [req-af4bf664-0c34-4956-93ce-31aeb46d596a req-d5aa7d64-1bca-436c-9dfb-7925da71208f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.746 182759 DEBUG nova.compute.manager [req-af4bf664-0c34-4956-93ce-31aeb46d596a req-d5aa7d64-1bca-436c-9dfb-7925da71208f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.746 182759 DEBUG nova.compute.manager [req-af4bf664-0c34-4956-93ce-31aeb46d596a req-d5aa7d64-1bca-436c-9dfb-7925da71208f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:47:23 np0005591285 systemd[1]: libpod-20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b.scope: Deactivated successfully.
Jan 21 18:47:23 np0005591285 podman[213761]: 2026-01-21 23:47:23.758629438 +0000 UTC m=+0.093364558 container died 20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.766 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.782 182759 INFO nova.virt.libvirt.driver [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Instance destroyed successfully.#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.784 182759 DEBUG nova.objects.instance [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lazy-loading 'resources' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:47:23 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b-userdata-shm.mount: Deactivated successfully.
Jan 21 18:47:23 np0005591285 systemd[1]: var-lib-containers-storage-overlay-c6952301675a4cf3fe0926dd062687530c3772970b250c346e29a97a2a0cea90-merged.mount: Deactivated successfully.
Jan 21 18:47:23 np0005591285 podman[213761]: 2026-01-21 23:47:23.800598911 +0000 UTC m=+0.135334031 container cleanup 20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.802 182759 DEBUG nova.virt.libvirt.vif [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1333035319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1333035319',id=12,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:45:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-45mgwptl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:46:16Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.803 182759 DEBUG nova.network.os_vif_util [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converting VIF {"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.804 182759 DEBUG nova.network.os_vif_util [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.804 182759 DEBUG os_vif [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.808 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbae5fde2-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.810 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.826 182759 INFO os_vif [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e')#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.826 182759 INFO nova.virt.libvirt.driver [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Deleting instance files /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7_del#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.827 182759 INFO nova.virt.libvirt.driver [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Deletion of /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7_del complete#033[00m
Jan 21 18:47:23 np0005591285 systemd[1]: libpod-conmon-20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b.scope: Deactivated successfully.
Jan 21 18:47:23 np0005591285 podman[213812]: 2026-01-21 23:47:23.881955628 +0000 UTC m=+0.054191591 container remove 20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.892 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[635a0277-dbf3-4f50-87f4-8d445a1309d0]: (4, ('Wed Jan 21 11:47:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b)\n20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b\nWed Jan 21 11:47:23 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b)\n20e53b70c2cdb2f99ac32a475e0929d5e135734e7afb1b21d0944f0fa45da17b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.894 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[97a9b6eb-9ddb-44de-a079-e567efdebe2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.896 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.898 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:23 np0005591285 kernel: tapb7816b8e-50: left promiscuous mode
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.915 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.919 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0698c5bf-ef3e-4f93-a54f-6aedb8b9eaef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.935 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ed505d9e-be3a-4a21-831b-3db35434f8d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.936 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9914da-5c23-40e5-a303-206937a9c5d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.955 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[95e48ac4-5cc0-416a-9f52-469ebbab1d60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372780, 'reachable_time': 23835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213825, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 systemd[1]: run-netns-ovnmeta\x2db7816b8e\x2d52c1\x2d4d60\x2d84f7\x2d524ebe7dfa5c.mount: Deactivated successfully.
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.962 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:47:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:23.962 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[e49b6533-e147-4a7f-a129-345d6a8d5acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.964 182759 INFO nova.compute.manager [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Took 0.51 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.966 182759 DEBUG oslo.service.loopingcall [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.967 182759 DEBUG nova.compute.manager [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.967 182759 DEBUG nova.network.neutron [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:47:23 np0005591285 nova_compute[182755]: 2026-01-21 23:47:23.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.055 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.057 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5429MB free_disk=73.3215217590332GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.058 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.058 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.190 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 69dceb72-db44-4bfc-9b98-cc8b39885ae7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.191 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 3a585d4f-6f31-4651-b848-0470f4eed464 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.191 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.191 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.438 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.460 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.506 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.507 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.509 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.509 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 18:47:24 np0005591285 nova_compute[182755]: 2026-01-21 23:47:24.528 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.156 182759 DEBUG nova.network.neutron [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.176 182759 INFO nova.compute.manager [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Took 1.21 seconds to deallocate network for instance.#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.283 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.284 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.402 182759 DEBUG nova.compute.provider_tree [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.529 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.530 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.674 182759 DEBUG nova.scheduler.client.report [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.679 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.708 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.779 182759 INFO nova.scheduler.client.report [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Deleted allocations for instance 69dceb72-db44-4bfc-9b98-cc8b39885ae7#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.782 182759 WARNING nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.783 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid 3a585d4f-6f31-4651-b848-0470f4eed464 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.783 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.783 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.783 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "3a585d4f-6f31-4651-b848-0470f4eed464" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.784 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.851 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "3a585d4f-6f31-4651-b848-0470f4eed464" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.921 182759 DEBUG nova.compute.manager [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.921 182759 DEBUG oslo_concurrency.lockutils [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.922 182759 DEBUG oslo_concurrency.lockutils [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.922 182759 DEBUG oslo_concurrency.lockutils [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.922 182759 DEBUG nova.compute.manager [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.922 182759 WARNING nova.compute.manager [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received unexpected event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.923 182759 DEBUG nova.compute.manager [req-4c727260-7b19-4169-8407-5a5925fda402 req-e267a3f4-47d7-46ec-8670-f6f61cea78cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-deleted-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.940 182759 DEBUG oslo_concurrency.lockutils [None req-7c4fc80a-b8f6-4577-8b38-65533f3c9aa9 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.941 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:25 np0005591285 nova_compute[182755]: 2026-01-21 23:47:25.976 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:28 np0005591285 nova_compute[182755]: 2026-01-21 23:47:28.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:28 np0005591285 nova_compute[182755]: 2026-01-21 23:47:28.982 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:30Z|00068|binding|INFO|Releasing lport 1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd from this chassis (sb_readonly=0)
Jan 21 18:47:30 np0005591285 nova_compute[182755]: 2026-01-21 23:47:30.841 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:33 np0005591285 nova_compute[182755]: 2026-01-21 23:47:33.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:33 np0005591285 nova_compute[182755]: 2026-01-21 23:47:33.985 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:36 np0005591285 nova_compute[182755]: 2026-01-21 23:47:36.175 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:36.175 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:47:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:36.178 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:47:37 np0005591285 podman[213826]: 2026-01-21 23:47:37.257322052 +0000 UTC m=+0.112500851 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:47:38 np0005591285 nova_compute[182755]: 2026-01-21 23:47:38.776 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039243.7739387, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:47:38 np0005591285 nova_compute[182755]: 2026-01-21 23:47:38.777 182759 INFO nova.compute.manager [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:47:38 np0005591285 nova_compute[182755]: 2026-01-21 23:47:38.803 182759 DEBUG nova.compute.manager [None req-253ee93a-203d-448c-9441-f4a1fd23c175 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:47:38 np0005591285 nova_compute[182755]: 2026-01-21 23:47:38.822 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:38 np0005591285 nova_compute[182755]: 2026-01-21 23:47:38.987 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:40 np0005591285 podman[213846]: 2026-01-21 23:47:40.250249705 +0000 UTC m=+0.108074802 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Jan 21 18:47:43 np0005591285 nova_compute[182755]: 2026-01-21 23:47:43.855 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:43 np0005591285 nova_compute[182755]: 2026-01-21 23:47:43.989 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:46.183 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:48 np0005591285 podman[213867]: 2026-01-21 23:47:48.215788051 +0000 UTC m=+0.080371431 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:47:48 np0005591285 nova_compute[182755]: 2026-01-21 23:47:48.907 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:48 np0005591285 nova_compute[182755]: 2026-01-21 23:47:48.991 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:51 np0005591285 podman[213891]: 2026-01-21 23:47:51.268359421 +0000 UTC m=+0.125564250 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.198 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.199 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.233 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.374 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.375 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.385 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.386 182759 INFO nova.compute.claims [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.554 182759 DEBUG nova.compute.provider_tree [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.578 182759 DEBUG nova.scheduler.client.report [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.606 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.607 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.678 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.679 182759 DEBUG nova.network.neutron [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.703 182759 INFO nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.731 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.890 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.893 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.894 182759 INFO nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Creating image(s)#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.895 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "/var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.895 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "/var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.897 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "/var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.927 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.953 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.994 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.998 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:53 np0005591285 nova_compute[182755]: 2026-01-21 23:47:53.999 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.000 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.015 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.046 182759 DEBUG nova.policy [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c3f927acf834c718155d5ee5dd81b19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.107 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.108 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.164 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.167 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.167 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:54 np0005591285 podman[213923]: 2026-01-21 23:47:54.208661486 +0000 UTC m=+0.075212823 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:47:54 np0005591285 podman[213925]: 2026-01-21 23:47:54.219797624 +0000 UTC m=+0.077752721 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.234 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.236 182759 DEBUG nova.virt.disk.api [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Checking if we can resize image /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.236 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.329 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.330 182759 DEBUG nova.virt.disk.api [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Cannot resize image /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.331 182759 DEBUG nova.objects.instance [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ed410d9-b024-4004-83b6-eb4618833532 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.345 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.345 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Ensure instance console log exists: /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.346 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.346 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:54 np0005591285 nova_compute[182755]: 2026-01-21 23:47:54.346 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:55 np0005591285 nova_compute[182755]: 2026-01-21 23:47:55.504 182759 DEBUG nova.network.neutron [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Successfully created port: 32a358ca-397f-4aed-b20f-8cbdde33b131 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:47:57 np0005591285 nova_compute[182755]: 2026-01-21 23:47:57.053 182759 DEBUG nova.network.neutron [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Successfully updated port: 32a358ca-397f-4aed-b20f-8cbdde33b131 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:47:57 np0005591285 nova_compute[182755]: 2026-01-21 23:47:57.076 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "refresh_cache-5ed410d9-b024-4004-83b6-eb4618833532" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:47:57 np0005591285 nova_compute[182755]: 2026-01-21 23:47:57.077 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquired lock "refresh_cache-5ed410d9-b024-4004-83b6-eb4618833532" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:47:57 np0005591285 nova_compute[182755]: 2026-01-21 23:47:57.077 182759 DEBUG nova.network.neutron [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:47:57 np0005591285 nova_compute[182755]: 2026-01-21 23:47:57.308 182759 DEBUG nova.network.neutron [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.408 182759 DEBUG nova.compute.manager [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-changed-32a358ca-397f-4aed-b20f-8cbdde33b131 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.408 182759 DEBUG nova.compute.manager [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Refreshing instance network info cache due to event network-changed-32a358ca-397f-4aed-b20f-8cbdde33b131. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.409 182759 DEBUG oslo_concurrency.lockutils [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5ed410d9-b024-4004-83b6-eb4618833532" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.478 182759 DEBUG nova.network.neutron [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Updating instance_info_cache with network_info: [{"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.508 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Releasing lock "refresh_cache-5ed410d9-b024-4004-83b6-eb4618833532" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.509 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Instance network_info: |[{"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.510 182759 DEBUG oslo_concurrency.lockutils [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5ed410d9-b024-4004-83b6-eb4618833532" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.511 182759 DEBUG nova.network.neutron [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Refreshing network info cache for port 32a358ca-397f-4aed-b20f-8cbdde33b131 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.519 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Start _get_guest_xml network_info=[{"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.528 182759 WARNING nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.536 182759 DEBUG nova.virt.libvirt.host [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.537 182759 DEBUG nova.virt.libvirt.host [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.548 182759 DEBUG nova.virt.libvirt.host [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.549 182759 DEBUG nova.virt.libvirt.host [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.551 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.552 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.553 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.554 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.554 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.555 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.555 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.556 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.556 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.557 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.558 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.558 182759 DEBUG nova.virt.hardware [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.568 182759 DEBUG nova.virt.libvirt.vif [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-682579751',display_name='tempest-ImagesOneServerNegativeTestJSON-server-682579751',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-682579751',id=24,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-ur8f6wuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_us
er_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:53Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=5ed410d9-b024-4004-83b6-eb4618833532,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.568 182759 DEBUG nova.network.os_vif_util [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.570 182759 DEBUG nova.network.os_vif_util [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.573 182759 DEBUG nova.objects.instance [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ed410d9-b024-4004-83b6-eb4618833532 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.590 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <uuid>5ed410d9-b024-4004-83b6-eb4618833532</uuid>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <name>instance-00000018</name>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-682579751</nova:name>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:47:58</nova:creationTime>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:user uuid="0c3f927acf834c718155d5ee5dd81b19">tempest-ImagesOneServerNegativeTestJSON-222133061-project-member</nova:user>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:project uuid="2edcdd2e6c5a46cb95eb89874a9cb5f3">tempest-ImagesOneServerNegativeTestJSON-222133061</nova:project>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        <nova:port uuid="32a358ca-397f-4aed-b20f-8cbdde33b131">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <entry name="serial">5ed410d9-b024-4004-83b6-eb4618833532</entry>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <entry name="uuid">5ed410d9-b024-4004-83b6-eb4618833532</entry>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.config"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:3b:3f:4e"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <target dev="tap32a358ca-39"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/console.log" append="off"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:47:58 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:47:58 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:47:58 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:47:58 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.592 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Preparing to wait for external event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.593 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.593 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.594 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.595 182759 DEBUG nova.virt.libvirt.vif [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-682579751',display_name='tempest-ImagesOneServerNegativeTestJSON-server-682579751',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-682579751',id=24,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-ur8f6wuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061
',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:53Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=5ed410d9-b024-4004-83b6-eb4618833532,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.596 182759 DEBUG nova.network.os_vif_util [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.597 182759 DEBUG nova.network.os_vif_util [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.598 182759 DEBUG os_vif [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.600 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.601 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.606 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.607 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32a358ca-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.608 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap32a358ca-39, col_values=(('external_ids', {'iface-id': '32a358ca-397f-4aed-b20f-8cbdde33b131', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:3f:4e', 'vm-uuid': '5ed410d9-b024-4004-83b6-eb4618833532'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.610 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:58 np0005591285 NetworkManager[55017]: <info>  [1769039278.6121] manager: (tap32a358ca-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.623 182759 INFO os_vif [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39')#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.696 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.697 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.698 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No VIF found with MAC fa:16:3e:3b:3f:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.699 182759 INFO nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Using config drive#033[00m
Jan 21 18:47:58 np0005591285 nova_compute[182755]: 2026-01-21 23:47:58.997 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.054 182759 INFO nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Creating config drive at /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.config#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.062 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_762ilvm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.210 182759 DEBUG oslo_concurrency.processutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_762ilvm" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:47:59 np0005591285 kernel: tap32a358ca-39: entered promiscuous mode
Jan 21 18:47:59 np0005591285 NetworkManager[55017]: <info>  [1769039279.3024] manager: (tap32a358ca-39): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Jan 21 18:47:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:59Z|00069|binding|INFO|Claiming lport 32a358ca-397f-4aed-b20f-8cbdde33b131 for this chassis.
Jan 21 18:47:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:59Z|00070|binding|INFO|32a358ca-397f-4aed-b20f-8cbdde33b131: Claiming fa:16:3e:3b:3f:4e 10.100.0.5
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.308 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.310 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.317 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:3f:4e 10.100.0.5'], port_security=['fa:16:3e:3b:3f:4e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5ed410d9-b024-4004-83b6-eb4618833532', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-135f4ca0-b287-4f82-8393-a426855e9926', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b452e9c4-b5fd-46cd-9749-caa7edf73c8c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=357b9b46-d446-48ea-adde-5992e2bcd56d, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=32a358ca-397f-4aed-b20f-8cbdde33b131) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.318 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 32a358ca-397f-4aed-b20f-8cbdde33b131 in datapath 135f4ca0-b287-4f82-8393-a426855e9926 bound to our chassis#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.320 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 135f4ca0-b287-4f82-8393-a426855e9926#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.333 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f9c93fe4-7159-418c-ad60-cd1c86f954d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.334 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap135f4ca0-b1 in ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.335 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap135f4ca0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.335 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf98c4b-f23d-4707-8783-bec60fe61992]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.336 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[60cf2f23-4631-4545-bb54-d981b7b9a5ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 systemd-udevd[213993]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.358 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[813d4028-29c2-460c-bd0e-fbaa65314935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 systemd-machined[154022]: New machine qemu-7-instance-00000018.
Jan 21 18:47:59 np0005591285 NetworkManager[55017]: <info>  [1769039279.3790] device (tap32a358ca-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:47:59 np0005591285 NetworkManager[55017]: <info>  [1769039279.3808] device (tap32a358ca-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:47:59 np0005591285 systemd[1]: Started Virtual Machine qemu-7-instance-00000018.
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:59Z|00071|binding|INFO|Setting lport 32a358ca-397f-4aed-b20f-8cbdde33b131 ovn-installed in OVS
Jan 21 18:47:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:59Z|00072|binding|INFO|Setting lport 32a358ca-397f-4aed-b20f-8cbdde33b131 up in Southbound
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.390 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.395 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[af82dbc2-a798-4d3a-a256-6e6c15ebcdfd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.435 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2b0510-4976-44ca-9ae7-f3c22f3f3a5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.444 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[56e70bf2-48c8-4d17-b157-f6bf4d41e884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 NetworkManager[55017]: <info>  [1769039279.4457] manager: (tap135f4ca0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.489 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7c5035-6901-40de-9eb0-d1257a3cc6cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.493 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5544e7-b58e-4cc7-8470-a375164a9de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 NetworkManager[55017]: <info>  [1769039279.5202] device (tap135f4ca0-b0): carrier: link connected
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.534 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[52d2a6f2-429c-4584-958c-3ebaf006cf45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.564 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[307b2916-0467-4fce-ad5e-e68823487239]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap135f4ca0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:bd:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383817, 'reachable_time': 37379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214024, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.588 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[38f49790-c8ea-4f74-abce-bae5990539a4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:bddf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383817, 'tstamp': 383817}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214025, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.624 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[539131e1-2a59-4097-8f06-5c48e867576a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap135f4ca0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:bd:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383817, 'reachable_time': 37379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214026, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.666 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59f48ba3-93f7-4940-9166-6d9a8984e764]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.726 182759 DEBUG nova.network.neutron [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Updated VIF entry in instance network info cache for port 32a358ca-397f-4aed-b20f-8cbdde33b131. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.727 182759 DEBUG nova.network.neutron [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Updating instance_info_cache with network_info: [{"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.740 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[efc7429f-b284-49b1-90bb-7867885d4e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.743 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap135f4ca0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.743 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.744 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap135f4ca0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 kernel: tap135f4ca0-b0: entered promiscuous mode
Jan 21 18:47:59 np0005591285 NetworkManager[55017]: <info>  [1769039279.7496] manager: (tap135f4ca0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.751 182759 DEBUG oslo_concurrency.lockutils [req-627f3d36-4a19-48b7-9838-1c394b6b2bef req-fd366c69-7f21-4765-8ae3-48c7a6279f82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5ed410d9-b024-4004-83b6-eb4618833532" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.752 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap135f4ca0-b0, col_values=(('external_ids', {'iface-id': 'f24d5ed7-f246-4123-afeb-d49e73610afb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.753 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:47:59Z|00073|binding|INFO|Releasing lport f24d5ed7-f246-4123-afeb-d49e73610afb from this chassis (sb_readonly=0)
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.767 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.768 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.769 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[28716b9b-3980-4a8d-b963-c3c7a7299180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.770 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:47:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:47:59.772 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'env', 'PROCESS_TAG=haproxy-135f4ca0-b287-4f82-8393-a426855e9926', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/135f4ca0-b287-4f82-8393-a426855e9926.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.980 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039279.9802384, 5ed410d9-b024-4004-83b6-eb4618833532 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:47:59 np0005591285 nova_compute[182755]: 2026-01-21 23:47:59.982 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] VM Started (Lifecycle Event)#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.005 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.012 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039279.9814434, 5ed410d9-b024-4004-83b6-eb4618833532 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.012 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.062 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.068 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.094 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:48:00 np0005591285 podman[214065]: 2026-01-21 23:48:00.298563907 +0000 UTC m=+0.086160566 container create 44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 18:48:00 np0005591285 podman[214065]: 2026-01-21 23:48:00.256239845 +0000 UTC m=+0.043836564 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:48:00 np0005591285 systemd[1]: Started libpod-conmon-44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317.scope.
Jan 21 18:48:00 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:48:00 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29a7c7e907863db96420e4fac5c64d62feb0a315ff74565e7c5f91b6ff9877b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:48:00 np0005591285 podman[214065]: 2026-01-21 23:48:00.427038494 +0000 UTC m=+0.214635223 container init 44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:48:00 np0005591285 podman[214065]: 2026-01-21 23:48:00.437167105 +0000 UTC m=+0.224763774 container start 44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 21 18:48:00 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [NOTICE]   (214084) : New worker (214086) forked
Jan 21 18:48:00 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [NOTICE]   (214084) : Loading success.
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.620 182759 DEBUG nova.compute.manager [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.621 182759 DEBUG oslo_concurrency.lockutils [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.621 182759 DEBUG oslo_concurrency.lockutils [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.621 182759 DEBUG oslo_concurrency.lockutils [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.622 182759 DEBUG nova.compute.manager [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Processing event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.622 182759 DEBUG nova.compute.manager [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.622 182759 DEBUG oslo_concurrency.lockutils [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.622 182759 DEBUG oslo_concurrency.lockutils [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.622 182759 DEBUG oslo_concurrency.lockutils [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.623 182759 DEBUG nova.compute.manager [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] No waiting events found dispatching network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.623 182759 WARNING nova.compute.manager [req-1d2c1775-d507-4ea6-8903-9e466f6aac90 req-9cfb93b1-257f-40ee-bebb-85deca59ac85 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received unexpected event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.623 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.630 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039280.629934, 5ed410d9-b024-4004-83b6-eb4618833532 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.632 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.633 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.636 182759 INFO nova.virt.libvirt.driver [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Instance spawned successfully.#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.637 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.658 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.665 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.668 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.669 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.669 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.669 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.670 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.670 182759 DEBUG nova.virt.libvirt.driver [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.708 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.781 182759 INFO nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Took 6.89 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.782 182759 DEBUG nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.873 182759 INFO nova.compute.manager [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Took 7.55 seconds to build instance.#033[00m
Jan 21 18:48:00 np0005591285 nova_compute[182755]: 2026-01-21 23:48:00.899 182759 DEBUG oslo_concurrency.lockutils [None req-f1985406-49a5-4583-b2e2-c94bcda3c46e 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:02.953 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:02.955 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:02.956 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:03 np0005591285 nova_compute[182755]: 2026-01-21 23:48:03.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:04 np0005591285 nova_compute[182755]: 2026-01-21 23:48:04.000 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:06 np0005591285 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 18:48:08 np0005591285 podman[214110]: 2026-01-21 23:48:08.265488341 +0000 UTC m=+0.124401599 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 18:48:08 np0005591285 nova_compute[182755]: 2026-01-21 23:48:08.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:09 np0005591285 nova_compute[182755]: 2026-01-21 23:48:09.003 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:11 np0005591285 nova_compute[182755]: 2026-01-21 23:48:11.047 182759 DEBUG nova.compute.manager [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:11 np0005591285 nova_compute[182755]: 2026-01-21 23:48:11.155 182759 INFO nova.compute.manager [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] instance snapshotting#033[00m
Jan 21 18:48:11 np0005591285 podman[214130]: 2026-01-21 23:48:11.260569952 +0000 UTC m=+0.118026578 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=)
Jan 21 18:48:11 np0005591285 nova_compute[182755]: 2026-01-21 23:48:11.593 182759 INFO nova.virt.libvirt.driver [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Beginning live snapshot process#033[00m
Jan 21 18:48:11 np0005591285 virtqemud[182299]: invalid argument: disk vda does not have an active block job
Jan 21 18:48:11 np0005591285 nova_compute[182755]: 2026-01-21 23:48:11.893 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:11 np0005591285 nova_compute[182755]: 2026-01-21 23:48:11.968 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk --force-share --output=json -f qcow2" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:11 np0005591285 nova_compute[182755]: 2026-01-21 23:48:11.970 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.066 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532/disk --force-share --output=json -f qcow2" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.079 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.143 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.145 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp46vzznah/40bf88d8fc774279bd7e42d3d320c76b.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.186 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp46vzznah/40bf88d8fc774279bd7e42d3d320c76b.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.188 182759 INFO nova.virt.libvirt.driver [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.268 182759 DEBUG nova.virt.libvirt.guest [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] COPY block job progress, current cursor: 0 final cursor: 27918336 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 18:48:12 np0005591285 nova_compute[182755]: 2026-01-21 23:48:12.776 182759 DEBUG nova.virt.libvirt.guest [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] COPY block job progress, current cursor: 27918336 final cursor: 27983872 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 18:48:12 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:12Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:3f:4e 10.100.0.5
Jan 21 18:48:12 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:12Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:3f:4e 10.100.0.5
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.281 182759 DEBUG nova.virt.libvirt.guest [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] COPY block job progress, current cursor: 76808192 final cursor: 76808192 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.286 182759 INFO nova.virt.libvirt.driver [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.344 182759 DEBUG nova.privsep.utils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.345 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp46vzznah/40bf88d8fc774279bd7e42d3d320c76b.delta /var/lib/nova/instances/snapshots/tmp46vzznah/40bf88d8fc774279bd7e42d3d320c76b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.845 182759 DEBUG oslo_concurrency.processutils [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp46vzznah/40bf88d8fc774279bd7e42d3d320c76b.delta /var/lib/nova/instances/snapshots/tmp46vzznah/40bf88d8fc774279bd7e42d3d320c76b" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:13 np0005591285 nova_compute[182755]: 2026-01-21 23:48:13.856 182759 INFO nova.virt.libvirt.driver [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Snapshot extracted, beginning image upload#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.193 182759 WARNING nova.compute.manager [None req-2a6fa16f-a14c-4e87-890d-c0c01faaa9a3 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Image not found during snapshot: nova.exception.ImageNotFound: Image e9b6ccb1-8b88-4a9d-bdef-352a03f74a9b could not be found.#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.963 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.964 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.965 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.965 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.966 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.980 182759 INFO nova.compute.manager [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Terminating instance#033[00m
Jan 21 18:48:14 np0005591285 nova_compute[182755]: 2026-01-21 23:48:14.995 182759 DEBUG nova.compute.manager [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:48:15 np0005591285 kernel: tap32a358ca-39 (unregistering): left promiscuous mode
Jan 21 18:48:15 np0005591285 NetworkManager[55017]: <info>  [1769039295.0329] device (tap32a358ca-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:48:15 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:15Z|00074|binding|INFO|Releasing lport 32a358ca-397f-4aed-b20f-8cbdde33b131 from this chassis (sb_readonly=0)
Jan 21 18:48:15 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:15Z|00075|binding|INFO|Setting lport 32a358ca-397f-4aed-b20f-8cbdde33b131 down in Southbound
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.045 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:15Z|00076|binding|INFO|Removing iface tap32a358ca-39 ovn-installed in OVS
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.048 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.059 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:3f:4e 10.100.0.5'], port_security=['fa:16:3e:3b:3f:4e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5ed410d9-b024-4004-83b6-eb4618833532', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-135f4ca0-b287-4f82-8393-a426855e9926', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b452e9c4-b5fd-46cd-9749-caa7edf73c8c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=357b9b46-d446-48ea-adde-5992e2bcd56d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=32a358ca-397f-4aed-b20f-8cbdde33b131) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.060 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.061 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 32a358ca-397f-4aed-b20f-8cbdde33b131 in datapath 135f4ca0-b287-4f82-8393-a426855e9926 unbound from our chassis#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.064 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 135f4ca0-b287-4f82-8393-a426855e9926, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.066 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b67a1342-8814-4e4f-9757-677d16cfa383]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.067 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 namespace which is not needed anymore#033[00m
Jan 21 18:48:15 np0005591285 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 21 18:48:15 np0005591285 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000018.scope: Consumed 13.191s CPU time.
Jan 21 18:48:15 np0005591285 systemd-machined[154022]: Machine qemu-7-instance-00000018 terminated.
Jan 21 18:48:15 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [NOTICE]   (214084) : haproxy version is 2.8.14-c23fe91
Jan 21 18:48:15 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [NOTICE]   (214084) : path to executable is /usr/sbin/haproxy
Jan 21 18:48:15 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [WARNING]  (214084) : Exiting Master process...
Jan 21 18:48:15 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [ALERT]    (214084) : Current worker (214086) exited with code 143 (Terminated)
Jan 21 18:48:15 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214080]: [WARNING]  (214084) : All workers exited. Exiting... (0)
Jan 21 18:48:15 np0005591285 systemd[1]: libpod-44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317.scope: Deactivated successfully.
Jan 21 18:48:15 np0005591285 podman[214235]: 2026-01-21 23:48:15.271843757 +0000 UTC m=+0.087854171 container died 44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.274 182759 INFO nova.virt.libvirt.driver [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Instance destroyed successfully.#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.275 182759 DEBUG nova.objects.instance [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'resources' on Instance uuid 5ed410d9-b024-4004-83b6-eb4618833532 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.297 182759 DEBUG nova.virt.libvirt.vif [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:47:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-682579751',display_name='tempest-ImagesOneServerNegativeTestJSON-server-682579751',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-682579751',id=24,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-ur8f6wuq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:14Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=5ed410d9-b024-4004-83b6-eb4618833532,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.298 182759 DEBUG nova.network.os_vif_util [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "32a358ca-397f-4aed-b20f-8cbdde33b131", "address": "fa:16:3e:3b:3f:4e", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32a358ca-39", "ovs_interfaceid": "32a358ca-397f-4aed-b20f-8cbdde33b131", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.299 182759 DEBUG nova.network.os_vif_util [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.299 182759 DEBUG os_vif [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.301 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.302 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32a358ca-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.304 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317-userdata-shm.mount: Deactivated successfully.
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.306 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.310 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 systemd[1]: var-lib-containers-storage-overlay-29a7c7e907863db96420e4fac5c64d62feb0a315ff74565e7c5f91b6ff9877b7-merged.mount: Deactivated successfully.
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.315 182759 INFO os_vif [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:3f:4e,bridge_name='br-int',has_traffic_filtering=True,id=32a358ca-397f-4aed-b20f-8cbdde33b131,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32a358ca-39')#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.316 182759 INFO nova.virt.libvirt.driver [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Deleting instance files /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532_del#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.317 182759 INFO nova.virt.libvirt.driver [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Deletion of /var/lib/nova/instances/5ed410d9-b024-4004-83b6-eb4618833532_del complete#033[00m
Jan 21 18:48:15 np0005591285 podman[214235]: 2026-01-21 23:48:15.321275119 +0000 UTC m=+0.137285543 container cleanup 44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 18:48:15 np0005591285 systemd[1]: libpod-conmon-44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317.scope: Deactivated successfully.
Jan 21 18:48:15 np0005591285 podman[214282]: 2026-01-21 23:48:15.393781589 +0000 UTC m=+0.044417669 container remove 44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.400 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[80db476d-3dff-4674-828d-c5a30b4adc20]: (4, ('Wed Jan 21 11:48:15 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 (44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317)\n44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317\nWed Jan 21 11:48:15 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 (44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317)\n44c90f9772a12a3eafbdb1e0073b3713c135aa18065296b8108f5b3e90e76317\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.403 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[42dd9ee5-3081-46d4-99f5-e2ed95fb4771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.405 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap135f4ca0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.408 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 kernel: tap135f4ca0-b0: left promiscuous mode
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.411 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.414 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a908ab-429d-47f5-ba98-6db2445e153c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.423 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.430 182759 INFO nova.compute.manager [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.433 182759 DEBUG oslo.service.loopingcall [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.433 182759 DEBUG nova.compute.manager [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.434 182759 DEBUG nova.network.neutron [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.442 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[10ac9f40-5a53-4d35-bbae-2d40004cb2bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.444 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec8eac6-8a66-4d62-bf77-3be51c624ed2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.466 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9178f66a-931d-4095-984a-a7e650bc93b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383808, 'reachable_time': 37107, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214297, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 systemd[1]: run-netns-ovnmeta\x2d135f4ca0\x2db287\x2d4f82\x2d8393\x2da426855e9926.mount: Deactivated successfully.
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.470 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:48:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:15.471 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[39b5f952-85fc-4dc5-ac61-2ee73b41a3db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.907 182759 DEBUG nova.compute.manager [req-ed8975c5-a378-4266-b873-bbe58c3f1c99 req-42d7e028-e756-41f5-a686-9e71a1e802d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-vif-unplugged-32a358ca-397f-4aed-b20f-8cbdde33b131 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.907 182759 DEBUG oslo_concurrency.lockutils [req-ed8975c5-a378-4266-b873-bbe58c3f1c99 req-42d7e028-e756-41f5-a686-9e71a1e802d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.908 182759 DEBUG oslo_concurrency.lockutils [req-ed8975c5-a378-4266-b873-bbe58c3f1c99 req-42d7e028-e756-41f5-a686-9e71a1e802d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.908 182759 DEBUG oslo_concurrency.lockutils [req-ed8975c5-a378-4266-b873-bbe58c3f1c99 req-42d7e028-e756-41f5-a686-9e71a1e802d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.908 182759 DEBUG nova.compute.manager [req-ed8975c5-a378-4266-b873-bbe58c3f1c99 req-42d7e028-e756-41f5-a686-9e71a1e802d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] No waiting events found dispatching network-vif-unplugged-32a358ca-397f-4aed-b20f-8cbdde33b131 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:48:15 np0005591285 nova_compute[182755]: 2026-01-21 23:48:15.909 182759 DEBUG nova.compute.manager [req-ed8975c5-a378-4266-b873-bbe58c3f1c99 req-42d7e028-e756-41f5-a686-9e71a1e802d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-vif-unplugged-32a358ca-397f-4aed-b20f-8cbdde33b131 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.370 182759 DEBUG nova.network.neutron [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.393 182759 INFO nova.compute.manager [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Took 0.96 seconds to deallocate network for instance.#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.492 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.494 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.499 182759 DEBUG nova.compute.manager [req-543bf671-e839-40ca-bbe6-105167a58271 req-8ecc2151-329f-4761-afe5-a3db1394178e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-vif-deleted-32a358ca-397f-4aed-b20f-8cbdde33b131 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.585 182759 DEBUG nova.compute.provider_tree [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.616 182759 DEBUG nova.scheduler.client.report [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.649 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.677 182759 INFO nova.scheduler.client.report [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Deleted allocations for instance 5ed410d9-b024-4004-83b6-eb4618833532#033[00m
Jan 21 18:48:16 np0005591285 nova_compute[182755]: 2026-01-21 23:48:16.803 182759 DEBUG oslo_concurrency.lockutils [None req-be7fedbe-61a3-4509-a6dc-cadba52ec189 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:17 np0005591285 nova_compute[182755]: 2026-01-21 23:48:17.322 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.039 182759 DEBUG nova.compute.manager [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.040 182759 DEBUG oslo_concurrency.lockutils [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5ed410d9-b024-4004-83b6-eb4618833532-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.040 182759 DEBUG oslo_concurrency.lockutils [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.040 182759 DEBUG oslo_concurrency.lockutils [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5ed410d9-b024-4004-83b6-eb4618833532-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.041 182759 DEBUG nova.compute.manager [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] No waiting events found dispatching network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.041 182759 WARNING nova.compute.manager [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Received unexpected event network-vif-plugged-32a358ca-397f-4aed-b20f-8cbdde33b131 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.289 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.290 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.290 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.291 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.291 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.306 182759 INFO nova.compute.manager [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Terminating instance#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.317 182759 DEBUG nova.compute.manager [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:48:18 np0005591285 kernel: tapeb56f7d1-8d (unregistering): left promiscuous mode
Jan 21 18:48:18 np0005591285 NetworkManager[55017]: <info>  [1769039298.3501] device (tapeb56f7d1-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:48:18 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:18Z|00077|binding|INFO|Releasing lport eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 from this chassis (sb_readonly=0)
Jan 21 18:48:18 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:18Z|00078|binding|INFO|Setting lport eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 down in Southbound
Jan 21 18:48:18 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:18Z|00079|binding|INFO|Removing iface tapeb56f7d1-8d ovn-installed in OVS
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.356 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.358 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.368 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:45:35 10.100.0.6'], port_security=['fa:16:3e:ae:45:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3a585d4f-6f31-4651-b848-0470f4eed464', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.370 104259 INFO neutron.agent.ovn.metadata.agent [-] Port eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.372 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1530a22a-f758-407d-b1aa-fd922904fe07, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.374 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1b838dca-6b82-4cc5-badb-298a3e1e713c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.374 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 namespace which is not needed anymore#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000013.scope: Deactivated successfully.
Jan 21 18:48:18 np0005591285 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000013.scope: Consumed 16.551s CPU time.
Jan 21 18:48:18 np0005591285 systemd-machined[154022]: Machine qemu-6-instance-00000013 terminated.
Jan 21 18:48:18 np0005591285 podman[214300]: 2026-01-21 23:48:18.47636101 +0000 UTC m=+0.090241384 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:48:18 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [NOTICE]   (213564) : haproxy version is 2.8.14-c23fe91
Jan 21 18:48:18 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [NOTICE]   (213564) : path to executable is /usr/sbin/haproxy
Jan 21 18:48:18 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [WARNING]  (213564) : Exiting Master process...
Jan 21 18:48:18 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [WARNING]  (213564) : Exiting Master process...
Jan 21 18:48:18 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [ALERT]    (213564) : Current worker (213566) exited with code 143 (Terminated)
Jan 21 18:48:18 np0005591285 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213560]: [WARNING]  (213564) : All workers exited. Exiting... (0)
Jan 21 18:48:18 np0005591285 systemd[1]: libpod-75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1.scope: Deactivated successfully.
Jan 21 18:48:18 np0005591285 podman[214345]: 2026-01-21 23:48:18.585411748 +0000 UTC m=+0.067768094 container died 75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.608 182759 INFO nova.virt.libvirt.driver [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Instance destroyed successfully.#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.609 182759 DEBUG nova.objects.instance [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'resources' on Instance uuid 3a585d4f-6f31-4651-b848-0470f4eed464 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.619 182759 DEBUG nova.compute.manager [req-2cc59eb8-17f0-4d78-aaeb-a02d8b42b5ff req-d5ed1608-2043-4929-96fa-767066438133 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-vif-unplugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.619 182759 DEBUG oslo_concurrency.lockutils [req-2cc59eb8-17f0-4d78-aaeb-a02d8b42b5ff req-d5ed1608-2043-4929-96fa-767066438133 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.619 182759 DEBUG oslo_concurrency.lockutils [req-2cc59eb8-17f0-4d78-aaeb-a02d8b42b5ff req-d5ed1608-2043-4929-96fa-767066438133 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.620 182759 DEBUG oslo_concurrency.lockutils [req-2cc59eb8-17f0-4d78-aaeb-a02d8b42b5ff req-d5ed1608-2043-4929-96fa-767066438133 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.620 182759 DEBUG nova.compute.manager [req-2cc59eb8-17f0-4d78-aaeb-a02d8b42b5ff req-d5ed1608-2043-4929-96fa-767066438133 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] No waiting events found dispatching network-vif-unplugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.620 182759 DEBUG nova.compute.manager [req-2cc59eb8-17f0-4d78-aaeb-a02d8b42b5ff req-d5ed1608-2043-4929-96fa-767066438133 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-vif-unplugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:48:18 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1-userdata-shm.mount: Deactivated successfully.
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.629 182759 DEBUG nova.virt.libvirt.vif [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:46:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-308403715',display_name='tempest-ServersAdminTestJSON-server-308403715',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-308403715',id=19,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-z067t4bf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:46:53Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=3a585d4f-6f31-4651-b848-0470f4eed464,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:48:18 np0005591285 systemd[1]: var-lib-containers-storage-overlay-4db65bed9faebea719c23bd0484d4cee0c61e85524154af12d57705602cdb015-merged.mount: Deactivated successfully.
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.630 182759 DEBUG nova.network.os_vif_util [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "address": "fa:16:3e:ae:45:35", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb56f7d1-8d", "ovs_interfaceid": "eb56f7d1-8dec-4faa-a727-c8bdf54f0af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.631 182759 DEBUG nova.network.os_vif_util [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.631 182759 DEBUG os_vif [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.633 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.633 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb56f7d1-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.637 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.637 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.640 182759 INFO os_vif [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:45:35,bridge_name='br-int',has_traffic_filtering=True,id=eb56f7d1-8dec-4faa-a727-c8bdf54f0af5,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb56f7d1-8d')#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.641 182759 INFO nova.virt.libvirt.driver [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Deleting instance files /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464_del#033[00m
Jan 21 18:48:18 np0005591285 podman[214345]: 2026-01-21 23:48:18.642255748 +0000 UTC m=+0.124612094 container cleanup 75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.642 182759 INFO nova.virt.libvirt.driver [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Deletion of /var/lib/nova/instances/3a585d4f-6f31-4651-b848-0470f4eed464_del complete#033[00m
Jan 21 18:48:18 np0005591285 systemd[1]: libpod-conmon-75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1.scope: Deactivated successfully.
Jan 21 18:48:18 np0005591285 podman[214391]: 2026-01-21 23:48:18.719244088 +0000 UTC m=+0.050204424 container remove 75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.720 182759 INFO nova.compute.manager [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.721 182759 DEBUG oslo.service.loopingcall [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.721 182759 DEBUG nova.compute.manager [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.721 182759 DEBUG nova.network.neutron [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.726 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9d8dda01-19a4-4420-872f-9054d451e6fc]: (4, ('Wed Jan 21 11:48:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 (75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1)\n75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1\nWed Jan 21 11:48:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 (75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1)\n75aca6e737251648c3ddab05a55c8354608790185af70e728b6702135e147ca1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.728 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8d7e73-57d4-48e7-961b-56d81f005d3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.729 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:18 np0005591285 kernel: tap1530a22a-f0: left promiscuous mode
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.732 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 nova_compute[182755]: 2026-01-21 23:48:18.755 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.759 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8af583e5-0b73-4386-b9ec-09712211b6ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.783 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ea94b5-f4ec-4643-b0e4-2f4440cd3ade]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.785 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b569f49a-c7ff-4149-b5dc-8239059829a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.808 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[297a1f85-0a0d-4e78-a9f9-cb132c0eb3e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377207, 'reachable_time': 22134, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214406, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:18 np0005591285 systemd[1]: run-netns-ovnmeta\x2d1530a22a\x2df758\x2d407d\x2db1aa\x2dfd922904fe07.mount: Deactivated successfully.
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.813 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:48:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:18.813 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[b4da4f35-8c14-4d23-9f1a-b16564c0f052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:19 np0005591285 nova_compute[182755]: 2026-01-21 23:48:19.006 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:19 np0005591285 nova_compute[182755]: 2026-01-21 23:48:19.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:19 np0005591285 nova_compute[182755]: 2026-01-21 23:48:19.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:48:19 np0005591285 nova_compute[182755]: 2026-01-21 23:48:19.245 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 18:48:19 np0005591285 nova_compute[182755]: 2026-01-21 23:48:19.246 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.242 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.718 182759 DEBUG nova.compute.manager [req-99de23fa-42d5-4670-9e73-f8e04a5b6fd9 req-5cb28bb1-ae43-4b92-b547-3c4115937d24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.719 182759 DEBUG oslo_concurrency.lockutils [req-99de23fa-42d5-4670-9e73-f8e04a5b6fd9 req-5cb28bb1-ae43-4b92-b547-3c4115937d24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.719 182759 DEBUG oslo_concurrency.lockutils [req-99de23fa-42d5-4670-9e73-f8e04a5b6fd9 req-5cb28bb1-ae43-4b92-b547-3c4115937d24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.720 182759 DEBUG oslo_concurrency.lockutils [req-99de23fa-42d5-4670-9e73-f8e04a5b6fd9 req-5cb28bb1-ae43-4b92-b547-3c4115937d24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.720 182759 DEBUG nova.compute.manager [req-99de23fa-42d5-4670-9e73-f8e04a5b6fd9 req-5cb28bb1-ae43-4b92-b547-3c4115937d24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] No waiting events found dispatching network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.720 182759 WARNING nova.compute.manager [req-99de23fa-42d5-4670-9e73-f8e04a5b6fd9 req-5cb28bb1-ae43-4b92-b547-3c4115937d24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received unexpected event network-vif-plugged-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.835 182759 DEBUG nova.network.neutron [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:48:20 np0005591285 nova_compute[182755]: 2026-01-21 23:48:20.868 182759 INFO nova.compute.manager [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Took 2.15 seconds to deallocate network for instance.#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.027 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.028 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.149 182759 DEBUG nova.compute.provider_tree [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.177 182759 DEBUG nova.scheduler.client.report [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.219 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.260 182759 INFO nova.scheduler.client.report [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Deleted allocations for instance 3a585d4f-6f31-4651-b848-0470f4eed464#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.330 182759 DEBUG nova.compute.manager [req-ba8c6624-b308-43e3-8168-f7b0048e4505 req-bffc0139-d7de-4a02-8ea7-f77a0603da57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Received event network-vif-deleted-eb56f7d1-8dec-4faa-a727-c8bdf54f0af5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.356 182759 DEBUG oslo_concurrency.lockutils [None req-b8173f1b-0a6b-4e23-9e98-fa1d12f1c48d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3a585d4f-6f31-4651-b848-0470f4eed464" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.465 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.465 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.513 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.861 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.862 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.869 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:48:21 np0005591285 nova_compute[182755]: 2026-01-21 23:48:21.870 182759 INFO nova.compute.claims [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.028 182759 DEBUG nova.compute.provider_tree [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.043 182759 DEBUG nova.scheduler.client.report [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.088 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.088 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.156 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.157 182759 DEBUG nova.network.neutron [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.192 182759 INFO nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.211 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.224 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:22 np0005591285 podman[214408]: 2026-01-21 23:48:22.272656196 +0000 UTC m=+0.116575300 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.347 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.348 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.389 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.403 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.405 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.405 182759 INFO nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Creating image(s)#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.406 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.406 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.407 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.419 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.480 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.480 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.481 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.491 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.512 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.513 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.522 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.522 182759 INFO nova.compute.claims [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.526 182759 DEBUG nova.network.neutron [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.526 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.552 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.553 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.598 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.599 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.600 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.691 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.692 182759 DEBUG nova.virt.disk.api [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Checking if we can resize image /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.693 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.732 182759 DEBUG nova.compute.provider_tree [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.757 182759 DEBUG nova.scheduler.client.report [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.789 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.790 182759 DEBUG nova.virt.disk.api [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Cannot resize image /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.790 182759 DEBUG nova.objects.instance [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.830 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.830 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Ensure instance console log exists: /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.831 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.831 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.831 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.833 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.835 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.836 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.843 182759 WARNING nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.848 182759 DEBUG nova.virt.libvirt.host [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.849 182759 DEBUG nova.virt.libvirt.host [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.852 182759 DEBUG nova.virt.libvirt.host [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.853 182759 DEBUG nova.virt.libvirt.host [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.854 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.854 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.855 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.855 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.855 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.855 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.855 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.856 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.856 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.856 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.856 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.856 182759 DEBUG nova.virt.hardware [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.861 182759 DEBUG nova.objects.instance [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.905 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <uuid>63b2e61e-8ad4-44e9-ba44-db37454a4b34</uuid>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <name>instance-0000001a</name>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:name>tempest-MigrationsAdminTest-server-1192752510</nova:name>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:48:22</nova:creationTime>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:        <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <entry name="serial">63b2e61e-8ad4-44e9-ba44-db37454a4b34</entry>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <entry name="uuid">63b2e61e-8ad4-44e9-ba44-db37454a4b34</entry>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/console.log" append="off"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:48:22 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:48:22 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:48:22 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:48:22 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.927 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.928 182759 DEBUG nova.network.neutron [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.951 182759 INFO nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.981 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.982 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.983 182759 INFO nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Using config drive#033[00m
Jan 21 18:48:22 np0005591285 nova_compute[182755]: 2026-01-21 23:48:22.987 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.125 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.127 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.127 182759 INFO nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Creating image(s)#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.128 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "/var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.128 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "/var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.129 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "/var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.147 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.148 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '63b2e61e-8ad4-44e9-ba44-db37454a4b34', 'name': 'tempest-MigrationsAdminTest-server-1192752510', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '95574103d0094883861c58d01690e5a3', 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'hostId': 'af4502268c6a69c8f225d78dcaeb3563262c66fb4e0ef6e51e9292e7', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.153 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.153 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.155 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.156 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.158 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.159 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.161 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.161 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.161 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>]
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.164 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.165 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.167 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.167 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.167 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>]
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.169 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.169 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.170 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>]
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.171 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.173 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.173 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.173 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1192752510>]
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.174 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.175 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.176 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.178 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.179 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.180 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.182 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.183 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.184 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 18:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:48:23.186 12 DEBUG ceilometer.compute.pollsters [-] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000001a, id=63b2e61e-8ad4-44e9-ba44-db37454a4b34>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.242 182759 INFO nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Creating config drive at /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.253 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph92fe1pz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.286 182759 DEBUG nova.policy [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c3f927acf834c718155d5ee5dd81b19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.291 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.293 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.294 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.295 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.295 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.297 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.298 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.322 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.388 182759 DEBUG oslo_concurrency.processutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph92fe1pz" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.415 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.416 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.480 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk 1073741824" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.483 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.485 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 systemd-machined[154022]: New machine qemu-8-instance-0000001a.
Jan 21 18:48:23 np0005591285 systemd[1]: Started Virtual Machine qemu-8-instance-0000001a.
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.564 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.566 182759 DEBUG nova.virt.disk.api [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Checking if we can resize image /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.567 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.607 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.645 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.665 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.666 182759 DEBUG nova.virt.disk.api [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Cannot resize image /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.667 182759 DEBUG nova.objects.instance [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'migration_context' on Instance uuid c45fafe2-1fc2-4740-a927-9f38ba3ab16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.682 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.683 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Ensure instance console log exists: /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.683 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.684 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.684 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.713 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.714 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.796 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.886 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039303.885743, 63b2e61e-8ad4-44e9-ba44-db37454a4b34 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.887 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.892 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.893 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.897 182759 INFO nova.virt.libvirt.driver [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance spawned successfully.#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.898 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.919 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.924 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.944 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.945 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.946 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.946 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.947 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.948 182759 DEBUG nova.virt.libvirt.driver [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.954 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.955 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039303.8875701, 63b2e61e-8ad4-44e9-ba44-db37454a4b34 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.955 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] VM Started (Lifecycle Event)#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.983 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:23 np0005591285 nova_compute[182755]: 2026-01-21 23:48:23.988 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.014 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.043 182759 INFO nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Took 1.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.044 182759 DEBUG nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.071 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.077 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5719MB free_disk=73.37585830688477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.077 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.077 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.163 182759 INFO nova.compute.manager [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Took 2.34 seconds to build instance.#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.183 182759 DEBUG oslo_concurrency.lockutils [None req-e98d6500-7fb6-4177-a9c6-3fced1196727 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.194 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.194 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance c45fafe2-1fc2-4740-a927-9f38ba3ab16b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.194 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.195 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.312 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.341 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.392 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.392 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:24 np0005591285 nova_compute[182755]: 2026-01-21 23:48:24.535 182759 DEBUG nova.network.neutron [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Successfully created port: 6fc91ab1-e76c-419d-9474-37ac429e4a4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:48:25 np0005591285 podman[214501]: 2026-01-21 23:48:25.229438134 +0000 UTC m=+0.088714195 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 21 18:48:25 np0005591285 podman[214502]: 2026-01-21 23:48:25.236179963 +0000 UTC m=+0.094702964 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.387 182759 DEBUG nova.network.neutron [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Successfully updated port: 6fc91ab1-e76c-419d-9474-37ac429e4a4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.393 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.412 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "refresh_cache-c45fafe2-1fc2-4740-a927-9f38ba3ab16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.413 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquired lock "refresh_cache-c45fafe2-1fc2-4740-a927-9f38ba3ab16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.413 182759 DEBUG nova.network.neutron [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.554 182759 DEBUG nova.compute.manager [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received event network-changed-6fc91ab1-e76c-419d-9474-37ac429e4a4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.555 182759 DEBUG nova.compute.manager [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Refreshing instance network info cache due to event network-changed-6fc91ab1-e76c-419d-9474-37ac429e4a4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.556 182759 DEBUG oslo_concurrency.lockutils [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c45fafe2-1fc2-4740-a927-9f38ba3ab16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:48:26 np0005591285 nova_compute[182755]: 2026-01-21 23:48:26.684 182759 DEBUG nova.network.neutron [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.307 182759 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.308 182759 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.308 182759 DEBUG nova.network.neutron [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.668 182759 DEBUG nova.network.neutron [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.806 182759 DEBUG nova.network.neutron [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Updating instance_info_cache with network_info: [{"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.854 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Releasing lock "refresh_cache-c45fafe2-1fc2-4740-a927-9f38ba3ab16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.855 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Instance network_info: |[{"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.856 182759 DEBUG oslo_concurrency.lockutils [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c45fafe2-1fc2-4740-a927-9f38ba3ab16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.857 182759 DEBUG nova.network.neutron [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Refreshing network info cache for port 6fc91ab1-e76c-419d-9474-37ac429e4a4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.862 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Start _get_guest_xml network_info=[{"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.870 182759 WARNING nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.877 182759 DEBUG nova.virt.libvirt.host [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.878 182759 DEBUG nova.virt.libvirt.host [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.883 182759 DEBUG nova.virt.libvirt.host [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.883 182759 DEBUG nova.virt.libvirt.host [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.885 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.886 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.887 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.887 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.888 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.888 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.888 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.889 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.889 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.890 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.890 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.891 182759 DEBUG nova.virt.hardware [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.898 182759 DEBUG nova.virt.libvirt.vif [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1662194836',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1662194836',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1662194836',id=27,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-bhpdwuo4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner
_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:23Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=c45fafe2-1fc2-4740-a927-9f38ba3ab16b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.899 182759 DEBUG nova.network.os_vif_util [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.902 182759 DEBUG nova.network.os_vif_util [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.904 182759 DEBUG nova.objects.instance [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid c45fafe2-1fc2-4740-a927-9f38ba3ab16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.928 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <uuid>c45fafe2-1fc2-4740-a927-9f38ba3ab16b</uuid>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <name>instance-0000001b</name>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1662194836</nova:name>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:48:27</nova:creationTime>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:user uuid="0c3f927acf834c718155d5ee5dd81b19">tempest-ImagesOneServerNegativeTestJSON-222133061-project-member</nova:user>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:project uuid="2edcdd2e6c5a46cb95eb89874a9cb5f3">tempest-ImagesOneServerNegativeTestJSON-222133061</nova:project>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        <nova:port uuid="6fc91ab1-e76c-419d-9474-37ac429e4a4b">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <entry name="serial">c45fafe2-1fc2-4740-a927-9f38ba3ab16b</entry>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <entry name="uuid">c45fafe2-1fc2-4740-a927-9f38ba3ab16b</entry>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.config"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:f5:71:6c"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <target dev="tap6fc91ab1-e7"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/console.log" append="off"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:48:27 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:48:27 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:48:27 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:48:27 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.930 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Preparing to wait for external event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.931 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.931 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.932 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.933 182759 DEBUG nova.virt.libvirt.vif [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1662194836',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1662194836',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1662194836',id=27,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-bhpdwuo4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:23Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=c45fafe2-1fc2-4740-a927-9f38ba3ab16b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.934 182759 DEBUG nova.network.os_vif_util [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.935 182759 DEBUG nova.network.os_vif_util [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.935 182759 DEBUG os_vif [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.937 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.939 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.940 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.948 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.949 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6fc91ab1-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.950 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6fc91ab1-e7, col_values=(('external_ids', {'iface-id': '6fc91ab1-e76c-419d-9474-37ac429e4a4b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:71:6c', 'vm-uuid': 'c45fafe2-1fc2-4740-a927-9f38ba3ab16b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.953 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:27 np0005591285 NetworkManager[55017]: <info>  [1769039307.9549] manager: (tap6fc91ab1-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.960 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.966 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:27 np0005591285 nova_compute[182755]: 2026-01-21 23:48:27.968 182759 INFO os_vif [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7')#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.012 182759 DEBUG nova.network.neutron [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.022 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.022 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.023 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No VIF found with MAC fa:16:3e:f5:71:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.024 182759 INFO nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Using config drive#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.178 182759 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.356 182759 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.357 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Creating file /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/01c27a3201b046deb6aa3d0c1feab643.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.358 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/01c27a3201b046deb6aa3d0c1feab643.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.689 182759 INFO nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Creating config drive at /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.config#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.699 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa1sqn3ij execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.818 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/01c27a3201b046deb6aa3d0c1feab643.tmp" returned: 1 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.819 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/01c27a3201b046deb6aa3d0c1feab643.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.820 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Creating directory /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.820 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.850 182759 DEBUG oslo_concurrency.processutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa1sqn3ij" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:48:28 np0005591285 NetworkManager[55017]: <info>  [1769039308.9435] manager: (tap6fc91ab1-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 21 18:48:28 np0005591285 kernel: tap6fc91ab1-e7: entered promiscuous mode
Jan 21 18:48:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:28Z|00080|binding|INFO|Claiming lport 6fc91ab1-e76c-419d-9474-37ac429e4a4b for this chassis.
Jan 21 18:48:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:28Z|00081|binding|INFO|6fc91ab1-e76c-419d-9474-37ac429e4a4b: Claiming fa:16:3e:f5:71:6c 10.100.0.10
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.952 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.965 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:71:6c 10.100.0.10'], port_security=['fa:16:3e:f5:71:6c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c45fafe2-1fc2-4740-a927-9f38ba3ab16b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-135f4ca0-b287-4f82-8393-a426855e9926', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b452e9c4-b5fd-46cd-9749-caa7edf73c8c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=357b9b46-d446-48ea-adde-5992e2bcd56d, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=6fc91ab1-e76c-419d-9474-37ac429e4a4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.968 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 6fc91ab1-e76c-419d-9474-37ac429e4a4b in datapath 135f4ca0-b287-4f82-8393-a426855e9926 bound to our chassis
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.970 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 18:48:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:28Z|00082|binding|INFO|Setting lport 6fc91ab1-e76c-419d-9474-37ac429e4a4b ovn-installed in OVS
Jan 21 18:48:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:28Z|00083|binding|INFO|Setting lport 6fc91ab1-e76c-419d-9474-37ac429e4a4b up in Southbound
Jan 21 18:48:28 np0005591285 nova_compute[182755]: 2026-01-21 23:48:28.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.992 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[41e2832b-3804-406a-b7cf-2909a637869c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.993 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap135f4ca0-b1 in ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.996 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap135f4ca0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.996 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c5cdf7-4c8f-499d-a5d6-f8f70c18fb7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:28.997 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7ecb039f-4f06-46a2-915b-d80a72ad98bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 systemd-machined[154022]: New machine qemu-9-instance-0000001b.
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.010 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:29 np0005591285 systemd[1]: Started Virtual Machine qemu-9-instance-0000001b.
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.013 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff8e082-483a-4825-a5a2-a1390173b7b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 systemd-udevd[214567]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.040 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[10727588-15ee-431f-a4d4-d9a5f7ff9f04]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 NetworkManager[55017]: <info>  [1769039309.0481] device (tap6fc91ab1-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:48:29 np0005591285 NetworkManager[55017]: <info>  [1769039309.0491] device (tap6fc91ab1-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.077 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.098 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3a11fb50-d89e-46ef-8779-982d5de9b726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.101 182759 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 18:48:29 np0005591285 NetworkManager[55017]: <info>  [1769039309.1063] manager: (tap135f4ca0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.105 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8c843223-37cd-4d5e-a330-3d3535af7af4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 systemd-udevd[214570]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.146 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c6277b63-ad49-45fa-8260-56b35005036f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.150 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[222f43a9-7331-4067-b26c-979e1df38d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 NetworkManager[55017]: <info>  [1769039309.1788] device (tap135f4ca0-b0): carrier: link connected
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.190 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[300c620d-a3fb-429f-b856-736b91055df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.217 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[299cf862-78c2-4444-b0db-d41a31f9c5b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap135f4ca0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:bd:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386783, 'reachable_time': 37548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214598, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.242 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d0aa90dd-e4f1-406a-b848-6c4512ee72b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:bddf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386783, 'tstamp': 386783}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214600, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.266 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3de2429b-3afd-48c1-9730-edcf4a52c203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap135f4ca0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:bd:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386783, 'reachable_time': 37548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214601, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.320 182759 DEBUG nova.compute.manager [req-ab21a65c-d53f-41eb-88a6-545a891fef5e req-8435dc53-2ccf-4bb2-b3bd-3673a6de70b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.321 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4d45dc-92e3-4b07-93f9-f1ad323d3d3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.329 182759 DEBUG oslo_concurrency.lockutils [req-ab21a65c-d53f-41eb-88a6-545a891fef5e req-8435dc53-2ccf-4bb2-b3bd-3673a6de70b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.329 182759 DEBUG oslo_concurrency.lockutils [req-ab21a65c-d53f-41eb-88a6-545a891fef5e req-8435dc53-2ccf-4bb2-b3bd-3673a6de70b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.329 182759 DEBUG oslo_concurrency.lockutils [req-ab21a65c-d53f-41eb-88a6-545a891fef5e req-8435dc53-2ccf-4bb2-b3bd-3673a6de70b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.330 182759 DEBUG nova.compute.manager [req-ab21a65c-d53f-41eb-88a6-545a891fef5e req-8435dc53-2ccf-4bb2-b3bd-3673a6de70b6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Processing event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.395 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6df9879e-e7a3-4892-b01f-6c2d50cca279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.396 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap135f4ca0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.396 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.397 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap135f4ca0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:48:29 np0005591285 NetworkManager[55017]: <info>  [1769039309.4004] manager: (tap135f4ca0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 21 18:48:29 np0005591285 kernel: tap135f4ca0-b0: entered promiscuous mode
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.399 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.408 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap135f4ca0-b0, col_values=(('external_ids', {'iface-id': 'f24d5ed7-f246-4123-afeb-d49e73610afb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:48:29 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:29Z|00084|binding|INFO|Releasing lport f24d5ed7-f246-4123-afeb-d49e73610afb from this chassis (sb_readonly=0)
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.414 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.415 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f66ed6ef-0156-4cbd-a6e6-7165339dbe8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.416 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 18:48:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:29.416 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'env', 'PROCESS_TAG=haproxy-135f4ca0-b287-4f82-8393-a426855e9926', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/135f4ca0-b287-4f82-8393-a426855e9926.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.434 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.758 182759 DEBUG nova.network.neutron [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Updated VIF entry in instance network info cache for port 6fc91ab1-e76c-419d-9474-37ac429e4a4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.761 182759 DEBUG nova.network.neutron [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Updating instance_info_cache with network_info: [{"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.780 182759 DEBUG oslo_concurrency.lockutils [req-1cb52d9c-dc55-4389-bf3c-785cedc792e4 req-c43e8780-3802-467c-9466-62f5a518df82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c45fafe2-1fc2-4740-a927-9f38ba3ab16b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:48:29 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:29Z|00085|binding|INFO|Releasing lport f24d5ed7-f246-4123-afeb-d49e73610afb from this chassis (sb_readonly=0)
Jan 21 18:48:29 np0005591285 podman[214632]: 2026-01-21 23:48:29.920958456 +0000 UTC m=+0.104821305 container create ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 18:48:29 np0005591285 nova_compute[182755]: 2026-01-21 23:48:29.933 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:29 np0005591285 podman[214632]: 2026-01-21 23:48:29.861319531 +0000 UTC m=+0.045182430 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:48:29 np0005591285 systemd[1]: Started libpod-conmon-ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59.scope.
Jan 21 18:48:29 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:48:29 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/306704e64bc2142bcad98580230395c469e2af3d302b86d8c056983864547fd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:48:30 np0005591285 podman[214632]: 2026-01-21 23:48:30.009000111 +0000 UTC m=+0.192863000 container init ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:48:30 np0005591285 podman[214632]: 2026-01-21 23:48:30.015383832 +0000 UTC m=+0.199246681 container start ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:48:30 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [NOTICE]   (214657) : New worker (214660) forked
Jan 21 18:48:30 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [NOTICE]   (214657) : Loading success.
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.077 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039310.0758226, c45fafe2-1fc2-4740-a927-9f38ba3ab16b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.077 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] VM Started (Lifecycle Event)
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.082 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.086 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.090 182759 INFO nova.virt.libvirt.driver [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Instance spawned successfully.
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.090 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.109 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.117 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.120 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.121 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.121 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.122 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.122 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.122 182759 DEBUG nova.virt.libvirt.driver [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.150 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.150 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039310.0764747, c45fafe2-1fc2-4740-a927-9f38ba3ab16b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.150 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] VM Paused (Lifecycle Event)
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.189 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.193 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039310.0859025, c45fafe2-1fc2-4740-a927-9f38ba3ab16b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.194 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] VM Resumed (Lifecycle Event)
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.220 182759 INFO nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Took 7.09 seconds to spawn the instance on the hypervisor.
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.221 182759 DEBUG nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.222 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.233 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.270 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039295.269231, 5ed410d9-b024-4004-83b6-eb4618833532 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.271 182759 INFO nova.compute.manager [-] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] VM Stopped (Lifecycle Event)
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.276 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.303 182759 DEBUG nova.compute.manager [None req-e734644f-13c6-462e-add9-62c274b36bc0 - - - - - -] [instance: 5ed410d9-b024-4004-83b6-eb4618833532] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.344 182759 INFO nova.compute.manager [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Took 7.89 seconds to build instance.
Jan 21 18:48:30 np0005591285 nova_compute[182755]: 2026-01-21 23:48:30.362 182759 DEBUG oslo_concurrency.lockutils [None req-f2134448-f17c-4eae-ab41-fe1a1b639086 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:31 np0005591285 nova_compute[182755]: 2026-01-21 23:48:31.515 182759 DEBUG nova.compute.manager [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:48:31 np0005591285 nova_compute[182755]: 2026-01-21 23:48:31.517 182759 DEBUG oslo_concurrency.lockutils [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:31 np0005591285 nova_compute[182755]: 2026-01-21 23:48:31.517 182759 DEBUG oslo_concurrency.lockutils [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:31 np0005591285 nova_compute[182755]: 2026-01-21 23:48:31.518 182759 DEBUG oslo_concurrency.lockutils [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:31 np0005591285 nova_compute[182755]: 2026-01-21 23:48:31.519 182759 DEBUG nova.compute.manager [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] No waiting events found dispatching network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:48:31 np0005591285 nova_compute[182755]: 2026-01-21 23:48:31.519 182759 WARNING nova.compute.manager [req-7317dd20-0511-4788-b092-68a798158853 req-abf666fb-3818-45f8-91b8-01151d3c98ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received unexpected event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b for instance with vm_state active and task_state None.
Jan 21 18:48:32 np0005591285 nova_compute[182755]: 2026-01-21 23:48:32.953 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:33 np0005591285 nova_compute[182755]: 2026-01-21 23:48:33.507 182759 DEBUG nova.compute.manager [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:33 np0005591285 nova_compute[182755]: 2026-01-21 23:48:33.580 182759 INFO nova.compute.manager [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] instance snapshotting
Jan 21 18:48:33 np0005591285 nova_compute[182755]: 2026-01-21 23:48:33.605 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039298.6041553, 3a585d4f-6f31-4651-b848-0470f4eed464 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:48:33 np0005591285 nova_compute[182755]: 2026-01-21 23:48:33.606 182759 INFO nova.compute.manager [-] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] VM Stopped (Lifecycle Event)
Jan 21 18:48:33 np0005591285 nova_compute[182755]: 2026-01-21 23:48:33.630 182759 DEBUG nova.compute.manager [None req-575fc3c5-17cf-4d4f-868b-3ce1ae503d04 - - - - - -] [instance: 3a585d4f-6f31-4651-b848-0470f4eed464] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:33 np0005591285 nova_compute[182755]: 2026-01-21 23:48:33.830 182759 INFO nova.virt.libvirt.driver [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Beginning live snapshot process
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.013 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:34 np0005591285 virtqemud[182299]: invalid argument: disk vda does not have an active block job
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.148 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.248 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk --force-share --output=json -f qcow2" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.251 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.333 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b/disk --force-share --output=json -f qcow2" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.360 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.448 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.451 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6zly7ipp/ed6053f6dc4b41c5a57fe99c6aa28643.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.506 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp6zly7ipp/ed6053f6dc4b41c5a57fe99c6aa28643.delta 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.508 182759 INFO nova.virt.libvirt.driver [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.564 182759 DEBUG nova.virt.libvirt.guest [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.569 182759 INFO nova.virt.libvirt.driver [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.615 182759 DEBUG nova.privsep.utils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.617 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6zly7ipp/ed6053f6dc4b41c5a57fe99c6aa28643.delta /var/lib/nova/instances/snapshots/tmp6zly7ipp/ed6053f6dc4b41c5a57fe99c6aa28643 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.778 182759 DEBUG oslo_concurrency.processutils [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp6zly7ipp/ed6053f6dc4b41c5a57fe99c6aa28643.delta /var/lib/nova/instances/snapshots/tmp6zly7ipp/ed6053f6dc4b41c5a57fe99c6aa28643" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:34 np0005591285 nova_compute[182755]: 2026-01-21 23:48:34.780 182759 INFO nova.virt.libvirt.driver [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Snapshot extracted, beginning image upload
Jan 21 18:48:35 np0005591285 nova_compute[182755]: 2026-01-21 23:48:35.122 182759 WARNING nova.compute.manager [None req-ed76da43-6cf8-4ba6-b5fa-aebf794c69b2 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Image not found during snapshot: nova.exception.ImageNotFound: Image bfe6dad8-b207-4ab6-8212-be1e53c01159 could not be found.
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.456 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.458 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.458 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.829 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.831 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.831 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.832 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.832 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.849 182759 INFO nova.compute.manager [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Terminating instance
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.861 182759 DEBUG nova.compute.manager [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:48:36 np0005591285 kernel: tap6fc91ab1-e7 (unregistering): left promiscuous mode
Jan 21 18:48:36 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:36Z|00086|binding|INFO|Releasing lport 6fc91ab1-e76c-419d-9474-37ac429e4a4b from this chassis (sb_readonly=0)
Jan 21 18:48:36 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:36Z|00087|binding|INFO|Setting lport 6fc91ab1-e76c-419d-9474-37ac429e4a4b down in Southbound
Jan 21 18:48:36 np0005591285 NetworkManager[55017]: <info>  [1769039316.8935] device (tap6fc91ab1-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:48:36 np0005591285 ovn_controller[94908]: 2026-01-21T23:48:36Z|00088|binding|INFO|Removing iface tap6fc91ab1-e7 ovn-installed in OVS
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.897 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.906 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:71:6c 10.100.0.10'], port_security=['fa:16:3e:f5:71:6c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c45fafe2-1fc2-4740-a927-9f38ba3ab16b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-135f4ca0-b287-4f82-8393-a426855e9926', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b452e9c4-b5fd-46cd-9749-caa7edf73c8c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=357b9b46-d446-48ea-adde-5992e2bcd56d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=6fc91ab1-e76c-419d-9474-37ac429e4a4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.909 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 6fc91ab1-e76c-419d-9474-37ac429e4a4b in datapath 135f4ca0-b287-4f82-8393-a426855e9926 unbound from our chassis
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.911 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 135f4ca0-b287-4f82-8393-a426855e9926, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.912 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[80cf258d-3498-41be-944e-0d7c12d3fded]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:48:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:36.913 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 namespace which is not needed anymore
Jan 21 18:48:36 np0005591285 nova_compute[182755]: 2026-01-21 23:48:36.922 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:36 np0005591285 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 21 18:48:36 np0005591285 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001b.scope: Consumed 8.064s CPU time.
Jan 21 18:48:36 np0005591285 systemd-machined[154022]: Machine qemu-9-instance-0000001b terminated.
Jan 21 18:48:37 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [NOTICE]   (214657) : haproxy version is 2.8.14-c23fe91
Jan 21 18:48:37 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [NOTICE]   (214657) : path to executable is /usr/sbin/haproxy
Jan 21 18:48:37 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [WARNING]  (214657) : Exiting Master process...
Jan 21 18:48:37 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [ALERT]    (214657) : Current worker (214660) exited with code 143 (Terminated)
Jan 21 18:48:37 np0005591285 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[214652]: [WARNING]  (214657) : All workers exited. Exiting... (0)
Jan 21 18:48:37 np0005591285 systemd[1]: libpod-ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59.scope: Deactivated successfully.
Jan 21 18:48:37 np0005591285 podman[214732]: 2026-01-21 23:48:37.079763122 +0000 UTC m=+0.059626507 container died ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.095 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.107 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59-userdata-shm.mount: Deactivated successfully.
Jan 21 18:48:37 np0005591285 systemd[1]: var-lib-containers-storage-overlay-306704e64bc2142bcad98580230395c469e2af3d302b86d8c056983864547fd3-merged.mount: Deactivated successfully.
Jan 21 18:48:37 np0005591285 podman[214732]: 2026-01-21 23:48:37.145782127 +0000 UTC m=+0.125645512 container cleanup ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.150 182759 INFO nova.virt.libvirt.driver [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Instance destroyed successfully.#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.152 182759 DEBUG nova.objects.instance [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'resources' on Instance uuid c45fafe2-1fc2-4740-a927-9f38ba3ab16b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:48:37 np0005591285 systemd[1]: libpod-conmon-ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59.scope: Deactivated successfully.
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.179 182759 DEBUG nova.virt.libvirt.vif [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1662194836',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1662194836',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1662194836',id=27,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-bhpdwuo4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:35Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=c45fafe2-1fc2-4740-a927-9f38ba3ab16b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.180 182759 DEBUG nova.network.os_vif_util [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "address": "fa:16:3e:f5:71:6c", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fc91ab1-e7", "ovs_interfaceid": "6fc91ab1-e76c-419d-9474-37ac429e4a4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.181 182759 DEBUG nova.network.os_vif_util [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.182 182759 DEBUG os_vif [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.186 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.186 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6fc91ab1-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.189 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.190 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.197 182759 INFO os_vif [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:71:6c,bridge_name='br-int',has_traffic_filtering=True,id=6fc91ab1-e76c-419d-9474-37ac429e4a4b,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fc91ab1-e7')#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.198 182759 INFO nova.virt.libvirt.driver [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Deleting instance files /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b_del#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.198 182759 INFO nova.virt.libvirt.driver [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Deletion of /var/lib/nova/instances/c45fafe2-1fc2-4740-a927-9f38ba3ab16b_del complete#033[00m
Jan 21 18:48:37 np0005591285 podman[214777]: 2026-01-21 23:48:37.216124529 +0000 UTC m=+0.043977357 container remove ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.222 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e61910-19e6-44eb-9a04-63a1d53acf20]: (4, ('Wed Jan 21 11:48:37 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 (ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59)\nac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59\nWed Jan 21 11:48:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 (ac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59)\nac43ea6029dcf9acda86ef80aaedffccb1d82ca3e0a4df52a2335f3476ad7e59\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.224 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[233145a2-033e-4380-b87c-a4b08af7a40e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.225 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap135f4ca0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.227 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 kernel: tap135f4ca0-b0: left promiscuous mode
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.254 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[402688ba-e0ab-495f-8fe9-54b475c1be54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.276 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa0d33f-901b-47ec-b4db-63a817629210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.278 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d28e1854-e477-46ef-8000-77515c885a15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.300 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2790b86c-8543-493f-9b5e-b60827aaecf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386774, 'reachable_time': 17563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214792, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.305 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:48:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:37.305 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[cd80f74e-b06e-44c6-b6d1-59d07e2f45b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:48:37 np0005591285 systemd[1]: run-netns-ovnmeta\x2d135f4ca0\x2db287\x2d4f82\x2d8393\x2da426855e9926.mount: Deactivated successfully.
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.315 182759 INFO nova.compute.manager [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.316 182759 DEBUG oslo.service.loopingcall [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.317 182759 DEBUG nova.compute.manager [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:48:37 np0005591285 nova_compute[182755]: 2026-01-21 23:48:37.317 182759 DEBUG nova.network.neutron [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.087 182759 DEBUG nova.network.neutron [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.114 182759 INFO nova.compute.manager [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Took 0.80 seconds to deallocate network for instance.#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.208 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.209 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.240 182759 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received event network-vif-unplugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.241 182759 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.242 182759 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.243 182759 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.243 182759 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] No waiting events found dispatching network-vif-unplugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.244 182759 WARNING nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received unexpected event network-vif-unplugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.244 182759 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.245 182759 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.245 182759 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.246 182759 DEBUG oslo_concurrency.lockutils [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.246 182759 DEBUG nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] No waiting events found dispatching network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.247 182759 WARNING nova.compute.manager [req-9bb03f31-31bc-4799-9999-b81c36cdee40 req-2cf22eea-e858-4689-91f6-e2ede33c9eea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received unexpected event network-vif-plugged-6fc91ab1-e76c-419d-9474-37ac429e4a4b for instance with vm_state deleted and task_state None.
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.332 182759 DEBUG nova.compute.manager [req-a5124084-7f3d-49e5-883a-84a2c019c5aa req-e65e88fd-e385-457e-9828-f1b291de4e95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Received event network-vif-deleted-6fc91ab1-e76c-419d-9474-37ac429e4a4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.342 182759 DEBUG nova.compute.provider_tree [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.362 182759 DEBUG nova.scheduler.client.report [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.396 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.421 182759 INFO nova.scheduler.client.report [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Deleted allocations for instance c45fafe2-1fc2-4740-a927-9f38ba3ab16b
Jan 21 18:48:38 np0005591285 nova_compute[182755]: 2026-01-21 23:48:38.494 182759 DEBUG oslo_concurrency.lockutils [None req-e53bc008-1275-4146-87f8-955b097cdd80 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "c45fafe2-1fc2-4740-a927-9f38ba3ab16b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:39 np0005591285 nova_compute[182755]: 2026-01-21 23:48:39.016 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:39 np0005591285 nova_compute[182755]: 2026-01-21 23:48:39.221 182759 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 18:48:39 np0005591285 podman[214796]: 2026-01-21 23:48:39.263615641 +0000 UTC m=+0.117142234 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute)
Jan 21 18:48:41 np0005591285 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 21 18:48:41 np0005591285 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000001a.scope: Consumed 14.071s CPU time.
Jan 21 18:48:41 np0005591285 systemd-machined[154022]: Machine qemu-8-instance-0000001a terminated.
Jan 21 18:48:41 np0005591285 podman[214816]: 2026-01-21 23:48:41.592726138 +0000 UTC m=+0.084405439 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.189 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.240 182759 INFO nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance shutdown successfully after 13 seconds.
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.249 182759 INFO nova.virt.libvirt.driver [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance destroyed successfully.
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.255 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.347 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.349 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.445 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.449 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Copying file /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk to 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:48:42 np0005591285 nova_compute[182755]: 2026-01-21 23:48:42.450 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.345 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "scp -r /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk" returned: 0 in 0.895s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.346 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Copying file /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.346 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk.config 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.603 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "scp -C -r /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk.config 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.604 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Copying file /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.604 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk.info 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:48:43 np0005591285 nova_compute[182755]: 2026-01-21 23:48:43.868 182759 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "scp -C -r /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_resize/disk.info 192.168.122.100:/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" returned: 0 in 0.264s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:48:44 np0005591285 nova_compute[182755]: 2026-01-21 23:48:44.053 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:44 np0005591285 nova_compute[182755]: 2026-01-21 23:48:44.083 182759 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:44 np0005591285 nova_compute[182755]: 2026-01-21 23:48:44.083 182759 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:44 np0005591285 nova_compute[182755]: 2026-01-21 23:48:44.083 182759 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:48:44.461 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:48:47 np0005591285 nova_compute[182755]: 2026-01-21 23:48:47.191 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:47 np0005591285 nova_compute[182755]: 2026-01-21 23:48:47.896 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:47 np0005591285 nova_compute[182755]: 2026-01-21 23:48:47.897 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:47 np0005591285 nova_compute[182755]: 2026-01-21 23:48:47.897 182759 DEBUG nova.compute.manager [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Going to confirm migration 8 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 21 18:48:47 np0005591285 nova_compute[182755]: 2026-01-21 23:48:47.944 182759 DEBUG nova.objects.instance [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'info_cache' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:48:48 np0005591285 nova_compute[182755]: 2026-01-21 23:48:48.706 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:48:48 np0005591285 nova_compute[182755]: 2026-01-21 23:48:48.707 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:48:48 np0005591285 nova_compute[182755]: 2026-01-21 23:48:48.707 182759 DEBUG nova.network.neutron [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:48:48 np0005591285 nova_compute[182755]: 2026-01-21 23:48:48.892 182759 DEBUG nova.network.neutron [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.057 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:49 np0005591285 podman[214859]: 2026-01-21 23:48:49.225594054 +0000 UTC m=+0.089683970 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.453 182759 DEBUG nova.network.neutron [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.472 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.472 182759 DEBUG nova.objects.instance [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.508 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.509 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.649 182759 DEBUG nova.compute.provider_tree [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.806 182759 DEBUG nova.scheduler.client.report [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:48:49 np0005591285 nova_compute[182755]: 2026-01-21 23:48:49.903 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:50 np0005591285 nova_compute[182755]: 2026-01-21 23:48:50.077 182759 INFO nova.scheduler.client.report [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Deleted allocation for migration 42bc1ed7-e22c-426c-943d-5b751761144a
Jan 21 18:48:50 np0005591285 nova_compute[182755]: 2026-01-21 23:48:50.148 182759 DEBUG oslo_concurrency.lockutils [None req-086d05fd-eb31-4921-ac90-9fac25334728 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:48:52 np0005591285 nova_compute[182755]: 2026-01-21 23:48:52.150 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039317.1474164, c45fafe2-1fc2-4740-a927-9f38ba3ab16b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:48:52 np0005591285 nova_compute[182755]: 2026-01-21 23:48:52.150 182759 INFO nova.compute.manager [-] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] VM Stopped (Lifecycle Event)
Jan 21 18:48:52 np0005591285 nova_compute[182755]: 2026-01-21 23:48:52.171 182759 DEBUG nova.compute.manager [None req-f1b24610-61e1-4480-b52c-919721983402 - - - - - -] [instance: c45fafe2-1fc2-4740-a927-9f38ba3ab16b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:48:52 np0005591285 nova_compute[182755]: 2026-01-21 23:48:52.193 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:53 np0005591285 podman[214884]: 2026-01-21 23:48:53.278545094 +0000 UTC m=+0.140577881 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:48:54 np0005591285 nova_compute[182755]: 2026-01-21 23:48:54.084 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:48:56 np0005591285 podman[214911]: 2026-01-21 23:48:56.224302987 +0000 UTC m=+0.085007676 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 18:48:56 np0005591285 podman[214910]: 2026-01-21 23:48:56.228634313 +0000 UTC m=+0.094370416 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 18:48:56 np0005591285 nova_compute[182755]: 2026-01-21 23:48:56.724 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039321.723162, 63b2e61e-8ad4-44e9-ba44-db37454a4b34 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:48:56 np0005591285 nova_compute[182755]: 2026-01-21 23:48:56.725 182759 INFO nova.compute.manager [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:48:56 np0005591285 nova_compute[182755]: 2026-01-21 23:48:56.759 182759 DEBUG nova.compute.manager [None req-79f3ba5d-26b5-4039-b615-47ed7859ff18 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:48:57 np0005591285 nova_compute[182755]: 2026-01-21 23:48:57.196 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:58 np0005591285 nova_compute[182755]: 2026-01-21 23:48:58.060 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:48:59 np0005591285 nova_compute[182755]: 2026-01-21 23:48:59.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:02 np0005591285 nova_compute[182755]: 2026-01-21 23:49:02.198 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:49:02.954 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:49:02.956 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:49:02.956 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.729 182759 DEBUG nova.compute.manager [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.849 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.850 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.904 182759 DEBUG nova.objects.instance [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.927 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.928 182759 INFO nova.compute.claims [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.928 182759 DEBUG nova.objects.instance [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.942 182759 DEBUG nova.objects.instance [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.996 182759 INFO nova.compute.resource_tracker [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating resource usage from migration 63e5814c-3982-49e0-b565-1e0faa0531cf#033[00m
Jan 21 18:49:03 np0005591285 nova_compute[182755]: 2026-01-21 23:49:03.996 182759 DEBUG nova.compute.resource_tracker [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Starting to track incoming migration 63e5814c-3982-49e0-b565-1e0faa0531cf with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 21 18:49:04 np0005591285 nova_compute[182755]: 2026-01-21 23:49:04.134 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:04 np0005591285 nova_compute[182755]: 2026-01-21 23:49:04.150 182759 DEBUG nova.compute.provider_tree [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:49:04 np0005591285 nova_compute[182755]: 2026-01-21 23:49:04.169 182759 DEBUG nova.scheduler.client.report [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:49:04 np0005591285 nova_compute[182755]: 2026-01-21 23:49:04.188 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:04 np0005591285 nova_compute[182755]: 2026-01-21 23:49:04.189 182759 INFO nova.compute.manager [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Migrating#033[00m
Jan 21 18:49:05 np0005591285 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:49:05 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:49:05 np0005591285 systemd-logind[788]: New session 29 of user nova.
Jan 21 18:49:05 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:49:05 np0005591285 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:49:05 np0005591285 systemd[214955]: Queued start job for default target Main User Target.
Jan 21 18:49:05 np0005591285 systemd[214955]: Created slice User Application Slice.
Jan 21 18:49:05 np0005591285 systemd[214955]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:49:05 np0005591285 systemd[214955]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:49:05 np0005591285 systemd[214955]: Reached target Paths.
Jan 21 18:49:05 np0005591285 systemd[214955]: Reached target Timers.
Jan 21 18:49:05 np0005591285 systemd[214955]: Starting D-Bus User Message Bus Socket...
Jan 21 18:49:05 np0005591285 systemd[214955]: Starting Create User's Volatile Files and Directories...
Jan 21 18:49:05 np0005591285 systemd[214955]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:49:05 np0005591285 systemd[214955]: Reached target Sockets.
Jan 21 18:49:05 np0005591285 systemd[214955]: Finished Create User's Volatile Files and Directories.
Jan 21 18:49:05 np0005591285 systemd[214955]: Reached target Basic System.
Jan 21 18:49:05 np0005591285 systemd[214955]: Reached target Main User Target.
Jan 21 18:49:05 np0005591285 systemd[214955]: Startup finished in 139ms.
Jan 21 18:49:05 np0005591285 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:49:05 np0005591285 systemd[1]: Started Session 29 of User nova.
Jan 21 18:49:05 np0005591285 systemd[1]: session-29.scope: Deactivated successfully.
Jan 21 18:49:05 np0005591285 systemd-logind[788]: Session 29 logged out. Waiting for processes to exit.
Jan 21 18:49:05 np0005591285 systemd-logind[788]: Removed session 29.
Jan 21 18:49:05 np0005591285 systemd-logind[788]: New session 31 of user nova.
Jan 21 18:49:05 np0005591285 systemd[1]: Started Session 31 of User nova.
Jan 21 18:49:05 np0005591285 systemd[1]: session-31.scope: Deactivated successfully.
Jan 21 18:49:05 np0005591285 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Jan 21 18:49:05 np0005591285 systemd-logind[788]: Removed session 31.
Jan 21 18:49:07 np0005591285 nova_compute[182755]: 2026-01-21 23:49:07.201 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:09 np0005591285 nova_compute[182755]: 2026-01-21 23:49:09.136 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:10 np0005591285 podman[214977]: 2026-01-21 23:49:10.234314171 +0000 UTC m=+0.087547979 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 21 18:49:12 np0005591285 nova_compute[182755]: 2026-01-21 23:49:12.204 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:12 np0005591285 podman[214998]: 2026-01-21 23:49:12.249370374 +0000 UTC m=+0.105579341 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal)
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.506 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "539483c9-32b1-4c30-b72a-be10a98b79fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.507 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "539483c9-32b1-4c30-b72a-be10a98b79fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.528 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.625 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.626 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.636 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.637 182759 INFO nova.compute.claims [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.799 182759 DEBUG nova.compute.provider_tree [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.819 182759 DEBUG nova.scheduler.client.report [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.847 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.848 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.946 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.947 182759 DEBUG nova.network.neutron [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.972 182759 INFO nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:49:13 np0005591285 nova_compute[182755]: 2026-01-21 23:49:13.997 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.128 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.130 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.131 182759 INFO nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Creating image(s)#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.132 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "/var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.132 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "/var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.134 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "/var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.162 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.165 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.263 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.265 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.267 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.291 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.380 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.382 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.428 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.430 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.431 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.512 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.514 182759 DEBUG nova.virt.disk.api [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Checking if we can resize image /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.515 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.604 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.606 182759 DEBUG nova.virt.disk.api [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Cannot resize image /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.607 182759 DEBUG nova.objects.instance [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lazy-loading 'migration_context' on Instance uuid 539483c9-32b1-4c30-b72a-be10a98b79fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.624 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.625 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Ensure instance console log exists: /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.625 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.626 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.626 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.825 182759 DEBUG nova.network.neutron [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.826 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.828 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.834 182759 WARNING nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.839 182759 DEBUG nova.virt.libvirt.host [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.839 182759 DEBUG nova.virt.libvirt.host [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.843 182759 DEBUG nova.virt.libvirt.host [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.843 182759 DEBUG nova.virt.libvirt.host [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.845 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.845 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.846 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.846 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.846 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.847 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.847 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.847 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.848 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.848 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.848 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.848 182759 DEBUG nova.virt.hardware [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.854 182759 DEBUG nova.objects.instance [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 539483c9-32b1-4c30-b72a-be10a98b79fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.870 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <uuid>539483c9-32b1-4c30-b72a-be10a98b79fa</uuid>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <name>instance-00000021</name>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:name>tempest-ListImageFiltersTestJSON-server-1912557340</nova:name>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:49:14</nova:creationTime>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:user uuid="abd17ede09d948d58de153b963381f13">tempest-ListImageFiltersTestJSON-2096581596-project-member</nova:user>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:        <nova:project uuid="54c1b2890dcc4b4599ff907adcbbbbb0">tempest-ListImageFiltersTestJSON-2096581596</nova:project>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <entry name="serial">539483c9-32b1-4c30-b72a-be10a98b79fa</entry>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <entry name="uuid">539483c9-32b1-4c30-b72a-be10a98b79fa</entry>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.config"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/console.log" append="off"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:49:14 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:49:14 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:49:14 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:49:14 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.955 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.956 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:49:14 np0005591285 nova_compute[182755]: 2026-01-21 23:49:14.957 182759 INFO nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Using config drive#033[00m
Jan 21 18:49:15 np0005591285 nova_compute[182755]: 2026-01-21 23:49:15.340 182759 INFO nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Creating config drive at /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.config#033[00m
Jan 21 18:49:15 np0005591285 nova_compute[182755]: 2026-01-21 23:49:15.346 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5scrmjqo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:15 np0005591285 nova_compute[182755]: 2026-01-21 23:49:15.474 182759 DEBUG oslo_concurrency.processutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5scrmjqo" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:15 np0005591285 systemd-machined[154022]: New machine qemu-10-instance-00000021.
Jan 21 18:49:15 np0005591285 systemd[1]: Started Virtual Machine qemu-10-instance-00000021.
Jan 21 18:49:15 np0005591285 systemd[214955]: Activating special unit Exit the Session...
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped target Main User Target.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped target Basic System.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped target Paths.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped target Sockets.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped target Timers.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:49:15 np0005591285 systemd[214955]: Closed D-Bus User Message Bus Socket.
Jan 21 18:49:15 np0005591285 systemd[214955]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:49:15 np0005591285 systemd[214955]: Removed slice User Application Slice.
Jan 21 18:49:15 np0005591285 systemd[214955]: Reached target Shutdown.
Jan 21 18:49:15 np0005591285 systemd[214955]: Finished Exit the Session.
Jan 21 18:49:15 np0005591285 systemd[214955]: Reached target Exit the Session.
Jan 21 18:49:15 np0005591285 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:49:15 np0005591285 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:49:15 np0005591285 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:49:15 np0005591285 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:49:15 np0005591285 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:49:15 np0005591285 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:49:15 np0005591285 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:49:15 np0005591285 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.018 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039356.016813, 539483c9-32b1-4c30-b72a-be10a98b79fa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.021 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] VM Resumed (Lifecycle Event)
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.029 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.030 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.035 182759 INFO nova.virt.libvirt.driver [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance spawned successfully.
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.036 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.054 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.063 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.068 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.069 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.070 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.070 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.071 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.072 182759 DEBUG nova.virt.libvirt.driver [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.094 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.095 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039356.0186183, 539483c9-32b1-4c30-b72a-be10a98b79fa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.095 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] VM Started (Lifecycle Event)
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.120 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.124 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.150 182759 INFO nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Took 2.02 seconds to spawn the instance on the hypervisor.
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.150 182759 DEBUG nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.152 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.298 182759 INFO nova.compute.manager [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Took 2.71 seconds to build instance.
Jan 21 18:49:16 np0005591285 nova_compute[182755]: 2026-01-21 23:49:16.318 182759 DEBUG oslo_concurrency.lockutils [None req-c7a31d95-be17-4c05-9f13-e5ecd603f6bd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "539483c9-32b1-4c30-b72a-be10a98b79fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:49:17 np0005591285 nova_compute[182755]: 2026-01-21 23:49:17.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:18 np0005591285 nova_compute[182755]: 2026-01-21 23:49:18.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.181 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:49:19 np0005591285 systemd-logind[788]: New session 32 of user nova.
Jan 21 18:49:19 np0005591285 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:49:19 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:49:19 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:49:19 np0005591285 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:49:19 np0005591285 podman[215066]: 2026-01-21 23:49:19.372226724 +0000 UTC m=+0.080645287 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.472 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-539483c9-32b1-4c30-b72a-be10a98b79fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.473 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-539483c9-32b1-4c30-b72a-be10a98b79fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.473 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.473 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 539483c9-32b1-4c30-b72a-be10a98b79fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:49:19 np0005591285 systemd[215079]: Queued start job for default target Main User Target.
Jan 21 18:49:19 np0005591285 systemd[215079]: Created slice User Application Slice.
Jan 21 18:49:19 np0005591285 systemd[215079]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:49:19 np0005591285 systemd[215079]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:49:19 np0005591285 systemd[215079]: Reached target Paths.
Jan 21 18:49:19 np0005591285 systemd[215079]: Reached target Timers.
Jan 21 18:49:19 np0005591285 systemd[215079]: Starting D-Bus User Message Bus Socket...
Jan 21 18:49:19 np0005591285 systemd[215079]: Starting Create User's Volatile Files and Directories...
Jan 21 18:49:19 np0005591285 systemd[215079]: Finished Create User's Volatile Files and Directories.
Jan 21 18:49:19 np0005591285 systemd[215079]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:49:19 np0005591285 systemd[215079]: Reached target Sockets.
Jan 21 18:49:19 np0005591285 systemd[215079]: Reached target Basic System.
Jan 21 18:49:19 np0005591285 systemd[215079]: Reached target Main User Target.
Jan 21 18:49:19 np0005591285 systemd[215079]: Startup finished in 155ms.
Jan 21 18:49:19 np0005591285 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:49:19 np0005591285 systemd[1]: Started Session 32 of User nova.
Jan 21 18:49:19 np0005591285 nova_compute[182755]: 2026-01-21 23:49:19.682 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:49:19 np0005591285 systemd[1]: session-32.scope: Deactivated successfully.
Jan 21 18:49:19 np0005591285 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Jan 21 18:49:19 np0005591285 systemd-logind[788]: Removed session 32.
Jan 21 18:49:20 np0005591285 systemd-logind[788]: New session 34 of user nova.
Jan 21 18:49:20 np0005591285 systemd[1]: Started Session 34 of User nova.
Jan 21 18:49:20 np0005591285 nova_compute[182755]: 2026-01-21 23:49:20.078 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:49:20 np0005591285 nova_compute[182755]: 2026-01-21 23:49:20.102 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-539483c9-32b1-4c30-b72a-be10a98b79fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:49:20 np0005591285 nova_compute[182755]: 2026-01-21 23:49:20.102 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 18:49:20 np0005591285 nova_compute[182755]: 2026-01-21 23:49:20.104 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:20 np0005591285 systemd[1]: session-34.scope: Deactivated successfully.
Jan 21 18:49:20 np0005591285 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Jan 21 18:49:20 np0005591285 systemd-logind[788]: Removed session 34.
Jan 21 18:49:20 np0005591285 systemd-logind[788]: New session 35 of user nova.
Jan 21 18:49:20 np0005591285 systemd[1]: Started Session 35 of User nova.
Jan 21 18:49:20 np0005591285 systemd[1]: session-35.scope: Deactivated successfully.
Jan 21 18:49:20 np0005591285 systemd-logind[788]: Session 35 logged out. Waiting for processes to exit.
Jan 21 18:49:20 np0005591285 systemd-logind[788]: Removed session 35.
Jan 21 18:49:21 np0005591285 nova_compute[182755]: 2026-01-21 23:49:21.100 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:21 np0005591285 nova_compute[182755]: 2026-01-21 23:49:21.477 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:49:21 np0005591285 nova_compute[182755]: 2026-01-21 23:49:21.478 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:49:21 np0005591285 nova_compute[182755]: 2026-01-21 23:49:21.479 182759 DEBUG nova.network.neutron [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:49:21 np0005591285 nova_compute[182755]: 2026-01-21 23:49:21.642 182759 DEBUG nova.network.neutron [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:49:22 np0005591285 nova_compute[182755]: 2026-01-21 23:49:22.211 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:22 np0005591285 nova_compute[182755]: 2026-01-21 23:49:22.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:22 np0005591285 nova_compute[182755]: 2026-01-21 23:49:22.254 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.249 182759 DEBUG nova.network.neutron [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.286 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.442 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.445 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.445 182759 INFO nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Creating image(s)
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.447 182759 DEBUG nova.objects.instance [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.466 182759 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.567 182759 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.569 182759 DEBUG nova.virt.disk.api [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Checking if we can resize image /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.570 182759 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.650 182759 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.652 182759 DEBUG nova.virt.disk.api [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Cannot resize image /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.669 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.670 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Ensure instance console log exists: /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.671 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.672 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.673 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.676 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.685 182759 WARNING nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.692 182759 DEBUG nova.virt.libvirt.host [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.693 182759 DEBUG nova.virt.libvirt.host [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.700 182759 DEBUG nova.virt.libvirt.host [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.701 182759 DEBUG nova.virt.libvirt.host [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.704 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.705 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.706 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.706 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.707 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.707 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.708 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.709 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.709 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.710 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.710 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.711 182759 DEBUG nova.virt.hardware [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.711 182759 DEBUG nova.objects.instance [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.738 182759 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.796 182759 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.799 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.799 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.801 182759 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.807 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <uuid>4d84ec02-4252-4dab-8580-d9961b6e6afd</uuid>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <name>instance-0000001e</name>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <memory>196608</memory>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:name>tempest-MigrationsAdminTest-server-867500350</nova:name>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:49:23</nova:creationTime>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.micro">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:memory>192</nova:memory>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:        <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <entry name="serial">4d84ec02-4252-4dab-8580-d9961b6e6afd</entry>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <entry name="uuid">4d84ec02-4252-4dab-8580-d9961b6e6afd</entry>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/console.log" append="off"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:49:23 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:49:23 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:49:23 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:49:23 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.915 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.916 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:49:23 np0005591285 nova_compute[182755]: 2026-01-21 23:49:23.916 182759 INFO nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Using config drive#033[00m
Jan 21 18:49:24 np0005591285 systemd-machined[154022]: New machine qemu-11-instance-0000001e.
Jan 21 18:49:24 np0005591285 systemd[1]: Started Virtual Machine qemu-11-instance-0000001e.
Jan 21 18:49:24 np0005591285 podman[215129]: 2026-01-21 23:49:24.061337304 +0000 UTC m=+0.190816709 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.184 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.348 182759 DEBUG nova.compute.manager [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.431 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039364.4304457, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.431 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.437 182759 DEBUG nova.compute.manager [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.443 182759 INFO nova.virt.libvirt.driver [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance running successfully.#033[00m
Jan 21 18:49:24 np0005591285 virtqemud[182299]: argument unsupported: QEMU guest agent is not configured
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.447 182759 DEBUG nova.virt.libvirt.guest [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.447 182759 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.551 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.566 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.569 182759 INFO nova.compute.manager [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] instance snapshotting#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.640 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.641 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039364.4366384, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.641 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Started (Lifecycle Event)#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.671 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.677 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:49:24 np0005591285 nova_compute[182755]: 2026-01-21 23:49:24.812 182759 INFO nova.virt.libvirt.driver [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Beginning live snapshot process#033[00m
Jan 21 18:49:25 np0005591285 virtqemud[182299]: invalid argument: disk vda does not have an active block job
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.004 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.099 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json -f qcow2" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.101 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.181 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.194 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.220 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.245 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.245 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.246 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.272 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.274 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkbk1c_op/bf843b802244485c82f77190d1edeae1.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.322 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkbk1c_op/bf843b802244485c82f77190d1edeae1.delta 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.323 182759 INFO nova.virt.libvirt.driver [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.378 182759 DEBUG nova.virt.libvirt.guest [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.381 182759 INFO nova.virt.libvirt.driver [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.385 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.437 182759 DEBUG nova.privsep.utils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.438 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkbk1c_op/bf843b802244485c82f77190d1edeae1.delta /var/lib/nova/instances/snapshots/tmpkbk1c_op/bf843b802244485c82f77190d1edeae1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.484 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.485 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.562 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.572 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.637 182759 DEBUG oslo_concurrency.processutils [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkbk1c_op/bf843b802244485c82f77190d1edeae1.delta /var/lib/nova/instances/snapshots/tmpkbk1c_op/bf843b802244485c82f77190d1edeae1" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.639 182759 INFO nova.virt.libvirt.driver [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Snapshot extracted, beginning image upload#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.667 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.669 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.736 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.995 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.998 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5521MB free_disk=73.34746170043945GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.998 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:25 np0005591285 nova_compute[182755]: 2026-01-21 23:49:25.999 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.059 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Applying migration context for instance 4d84ec02-4252-4dab-8580-d9961b6e6afd as it has an incoming, in-progress migration 63e5814c-3982-49e0-b565-1e0faa0531cf. Migration status is finished _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.060 182759 INFO nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating resource usage from migration 63e5814c-3982-49e0-b565-1e0faa0531cf#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.091 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 4d84ec02-4252-4dab-8580-d9961b6e6afd actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.092 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 539483c9-32b1-4c30-b72a-be10a98b79fa actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.092 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.092 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.131 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.155 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.155 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.181 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.212 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.336 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.352 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.372 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.373 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.615 182759 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.616 182759 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.616 182759 DEBUG nova.network.neutron [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:49:26 np0005591285 nova_compute[182755]: 2026-01-21 23:49:26.781 182759 DEBUG nova.network.neutron [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.013 182759 DEBUG nova.network.neutron [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.028 182759 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.044 182759 DEBUG nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Creating tmpfile /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/tmpjs89j4hm to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Jan 21 18:49:27 np0005591285 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 21 18:49:27 np0005591285 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001e.scope: Consumed 2.958s CPU time.
Jan 21 18:49:27 np0005591285 systemd-machined[154022]: Machine qemu-11-instance-0000001e terminated.
Jan 21 18:49:27 np0005591285 podman[215218]: 2026-01-21 23:49:27.213163026 +0000 UTC m=+0.078431999 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 18:49:27 np0005591285 podman[215219]: 2026-01-21 23:49:27.213596327 +0000 UTC m=+0.080055772 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.213 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.315 182759 INFO nova.virt.libvirt.driver [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance destroyed successfully.#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.316 182759 DEBUG nova.objects.instance [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.331 182759 INFO nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Deleting instance files /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_del#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.338 182759 INFO nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Deletion of /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_del complete#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.486 182759 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.487 182759 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.503 182759 DEBUG nova.objects.instance [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.576 182759 DEBUG nova.compute.provider_tree [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.594 182759 DEBUG nova.scheduler.client.report [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.671 182759 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.724 182759 INFO nova.virt.libvirt.driver [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Snapshot image upload complete#033[00m
Jan 21 18:49:27 np0005591285 nova_compute[182755]: 2026-01-21 23:49:27.725 182759 INFO nova.compute.manager [None req-73d4e93b-928c-47a8-aaff-d4156f21e743 abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Took 3.14 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 21 18:49:28 np0005591285 nova_compute[182755]: 2026-01-21 23:49:28.372 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:49:29 np0005591285 nova_compute[182755]: 2026-01-21 23:49:29.231 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:30 np0005591285 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:49:30 np0005591285 systemd[215079]: Activating special unit Exit the Session...
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped target Main User Target.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped target Basic System.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped target Paths.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped target Sockets.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped target Timers.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:49:30 np0005591285 systemd[215079]: Closed D-Bus User Message Bus Socket.
Jan 21 18:49:30 np0005591285 systemd[215079]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:49:30 np0005591285 systemd[215079]: Removed slice User Application Slice.
Jan 21 18:49:30 np0005591285 systemd[215079]: Reached target Shutdown.
Jan 21 18:49:30 np0005591285 systemd[215079]: Finished Exit the Session.
Jan 21 18:49:30 np0005591285 systemd[215079]: Reached target Exit the Session.
Jan 21 18:49:30 np0005591285 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:49:30 np0005591285 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:49:30 np0005591285 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:49:30 np0005591285 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:49:30 np0005591285 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:49:30 np0005591285 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:49:30 np0005591285 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:49:32 np0005591285 nova_compute[182755]: 2026-01-21 23:49:32.216 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:32 np0005591285 ovn_controller[94908]: 2026-01-21T23:49:32Z|00089|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:49:33 np0005591285 nova_compute[182755]: 2026-01-21 23:49:33.368 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:49:33.367 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:49:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:49:33.371 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:49:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:49:33.373 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:49:33 np0005591285 nova_compute[182755]: 2026-01-21 23:49:33.856 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "0f91ac3a-2383-45bf-94b7-631c1737e936" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:33 np0005591285 nova_compute[182755]: 2026-01-21 23:49:33.857 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:33 np0005591285 nova_compute[182755]: 2026-01-21 23:49:33.904 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.058 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.059 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.069 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.070 182759 INFO nova.compute.claims [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.233 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.262 182759 DEBUG nova.compute.provider_tree [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.308 182759 DEBUG nova.scheduler.client.report [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.526 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.528 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.618 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.619 182759 DEBUG nova.network.neutron [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.642 182759 INFO nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.722 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.887 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.889 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.890 182759 INFO nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Creating image(s)#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.891 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.891 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.893 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:34 np0005591285 nova_compute[182755]: 2026-01-21 23:49:34.919 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.020 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.022 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.023 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.052 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.144 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.147 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.200 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.202 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.203 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.293 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.295 182759 DEBUG nova.virt.disk.api [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Checking if we can resize image /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.296 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.340 182759 DEBUG nova.network.neutron [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.342 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.402 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.403 182759 DEBUG nova.virt.disk.api [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Cannot resize image /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.404 182759 DEBUG nova.objects.instance [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.421 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.421 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Ensure instance console log exists: /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.422 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.423 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.424 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.427 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.435 182759 WARNING nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.443 182759 DEBUG nova.virt.libvirt.host [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.443 182759 DEBUG nova.virt.libvirt.host [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.448 182759 DEBUG nova.virt.libvirt.host [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.449 182759 DEBUG nova.virt.libvirt.host [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.451 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.452 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.453 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.453 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.454 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.454 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.455 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.455 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.456 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.456 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.457 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.457 182759 DEBUG nova.virt.hardware [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.466 182759 DEBUG nova.objects.instance [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.494 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <uuid>0f91ac3a-2383-45bf-94b7-631c1737e936</uuid>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <name>instance-00000023</name>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:name>tempest-MigrationsAdminTest-server-1788036298</nova:name>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:49:35</nova:creationTime>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:        <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <entry name="serial">0f91ac3a-2383-45bf-94b7-631c1737e936</entry>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <entry name="uuid">0f91ac3a-2383-45bf-94b7-631c1737e936</entry>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/console.log" append="off"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:49:35 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:49:35 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:49:35 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:49:35 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.563 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.564 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.564 182759 INFO nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Using config drive#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.886 182759 INFO nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Creating config drive at /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config#033[00m
Jan 21 18:49:35 np0005591285 nova_compute[182755]: 2026-01-21 23:49:35.891 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0eef2wc8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.023 182759 DEBUG oslo_concurrency.processutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0eef2wc8" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:36 np0005591285 systemd-machined[154022]: New machine qemu-12-instance-00000023.
Jan 21 18:49:36 np0005591285 systemd[1]: Started Virtual Machine qemu-12-instance-00000023.
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.404 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039376.403639, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.405 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.411 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.412 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.418 182759 INFO nova.virt.libvirt.driver [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance spawned successfully.#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.418 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.448 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.453 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.453 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.454 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.455 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.456 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.456 182759 DEBUG nova.virt.libvirt.driver [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.466 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.494 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.495 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039376.4052064, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.495 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Started (Lifecycle Event)#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.536 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.542 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.565 182759 INFO nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Took 1.68 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.566 182759 DEBUG nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.571 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.692 182759 INFO nova.compute.manager [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Took 2.68 seconds to build instance.#033[00m
Jan 21 18:49:36 np0005591285 nova_compute[182755]: 2026-01-21 23:49:36.722 182759 DEBUG oslo_concurrency.lockutils [None req-5d705a71-6e91-4e95-a0cf-3d339902d7b9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.865s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:37 np0005591285 nova_compute[182755]: 2026-01-21 23:49:37.257 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:39 np0005591285 nova_compute[182755]: 2026-01-21 23:49:39.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.056 182759 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.061 182759 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquired lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.062 182759 DEBUG nova.network.neutron [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:49:41 np0005591285 podman[215323]: 2026-01-21 23:49:41.257299846 +0000 UTC m=+0.109289029 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.342 182759 DEBUG nova.network.neutron [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.699 182759 DEBUG nova.network.neutron [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.747 182759 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Releasing lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.869 182759 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.874 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Creating file /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/85a84cadfbf5420487a3a3ee388356eb.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 21 18:49:41 np0005591285 nova_compute[182755]: 2026-01-21 23:49:41.875 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/85a84cadfbf5420487a3a3ee388356eb.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.259 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.313 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039367.312057, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.314 182759 INFO nova.compute.manager [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Stopped (Lifecycle Event)
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.341 182759 DEBUG nova.compute.manager [None req-d577f208-27d1-4816-9752-277f824881a7 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.381 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/85a84cadfbf5420487a3a3ee388356eb.tmp" returned: 1 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.385 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/85a84cadfbf5420487a3a3ee388356eb.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.386 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Creating directory /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.391 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.722 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936" returned: 0 in 0.331s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:42 np0005591285 nova_compute[182755]: 2026-01-21 23:49:42.727 182759 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 18:49:43 np0005591285 podman[215345]: 2026-01-21 23:49:43.221588708 +0000 UTC m=+0.093662719 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7)
Jan 21 18:49:44 np0005591285 nova_compute[182755]: 2026-01-21 23:49:44.279 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:47 np0005591285 nova_compute[182755]: 2026-01-21 23:49:47.264 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.089 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "539483c9-32b1-4c30-b72a-be10a98b79fa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.089 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "539483c9-32b1-4c30-b72a-be10a98b79fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.090 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "539483c9-32b1-4c30-b72a-be10a98b79fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.090 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "539483c9-32b1-4c30-b72a-be10a98b79fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.090 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "539483c9-32b1-4c30-b72a-be10a98b79fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.103 182759 INFO nova.compute.manager [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Terminating instance
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.114 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "refresh_cache-539483c9-32b1-4c30-b72a-be10a98b79fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.115 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquired lock "refresh_cache-539483c9-32b1-4c30-b72a-be10a98b79fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.115 182759 DEBUG nova.network.neutron [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:49:48 np0005591285 nova_compute[182755]: 2026-01-21 23:49:48.402 182759 DEBUG nova.network.neutron [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.023 182759 DEBUG nova.network.neutron [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.042 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Releasing lock "refresh_cache-539483c9-32b1-4c30-b72a-be10a98b79fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.043 182759 DEBUG nova.compute.manager [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 18:49:49 np0005591285 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 21 18:49:49 np0005591285 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000021.scope: Consumed 13.737s CPU time.
Jan 21 18:49:49 np0005591285 systemd-machined[154022]: Machine qemu-10-instance-00000021 terminated.
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.308 182759 INFO nova.virt.libvirt.driver [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance destroyed successfully.
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.309 182759 DEBUG nova.objects.instance [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lazy-loading 'resources' on Instance uuid 539483c9-32b1-4c30-b72a-be10a98b79fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.331 182759 INFO nova.virt.libvirt.driver [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Deleting instance files /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa_del
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.333 182759 INFO nova.virt.libvirt.driver [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Deletion of /var/lib/nova/instances/539483c9-32b1-4c30-b72a-be10a98b79fa_del complete
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.434 182759 INFO nova.compute.manager [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.435 182759 DEBUG oslo.service.loopingcall [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.435 182759 DEBUG nova.compute.manager [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.436 182759 DEBUG nova.network.neutron [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.750 182759 DEBUG nova.network.neutron [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.770 182759 DEBUG nova.network.neutron [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.794 182759 INFO nova.compute.manager [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Took 0.36 seconds to deallocate network for instance.
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.900 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:49:49 np0005591285 nova_compute[182755]: 2026-01-21 23:49:49.901 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:49:50 np0005591285 nova_compute[182755]: 2026-01-21 23:49:50.017 182759 DEBUG nova.compute.provider_tree [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:49:50 np0005591285 nova_compute[182755]: 2026-01-21 23:49:50.039 182759 DEBUG nova.scheduler.client.report [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:49:50 np0005591285 nova_compute[182755]: 2026-01-21 23:49:50.072 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:49:50 np0005591285 nova_compute[182755]: 2026-01-21 23:49:50.133 182759 INFO nova.scheduler.client.report [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Deleted allocations for instance 539483c9-32b1-4c30-b72a-be10a98b79fa
Jan 21 18:49:50 np0005591285 nova_compute[182755]: 2026-01-21 23:49:50.214 182759 DEBUG oslo_concurrency.lockutils [None req-04b29d72-257d-44e3-9371-ef1d2b8e0efd abd17ede09d948d58de153b963381f13 54c1b2890dcc4b4599ff907adcbbbbb0 - - default default] Lock "539483c9-32b1-4c30-b72a-be10a98b79fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:49:50 np0005591285 podman[215391]: 2026-01-21 23:49:50.229181324 +0000 UTC m=+0.093617798 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:49:52 np0005591285 nova_compute[182755]: 2026-01-21 23:49:52.266 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:52 np0005591285 nova_compute[182755]: 2026-01-21 23:49:52.790 182759 DEBUG nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 18:49:54 np0005591285 nova_compute[182755]: 2026-01-21 23:49:54.285 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:49:54 np0005591285 podman[215417]: 2026-01-21 23:49:54.294584177 +0000 UTC m=+0.154022542 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:49:55 np0005591285 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 21 18:49:55 np0005591285 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000023.scope: Consumed 12.855s CPU time.
Jan 21 18:49:55 np0005591285 systemd-machined[154022]: Machine qemu-12-instance-00000023 terminated.
Jan 21 18:49:55 np0005591285 nova_compute[182755]: 2026-01-21 23:49:55.812 182759 INFO nova.virt.libvirt.driver [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance shutdown successfully after 13 seconds.
Jan 21 18:49:55 np0005591285 nova_compute[182755]: 2026-01-21 23:49:55.820 182759 INFO nova.virt.libvirt.driver [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance destroyed successfully.
Jan 21 18:49:55 np0005591285 nova_compute[182755]: 2026-01-21 23:49:55.828 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:55 np0005591285 nova_compute[182755]: 2026-01-21 23:49:55.924 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:55 np0005591285 nova_compute[182755]: 2026-01-21 23:49:55.927 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:56 np0005591285 nova_compute[182755]: 2026-01-21 23:49:56.046 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk --force-share --output=json" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:56 np0005591285 nova_compute[182755]: 2026-01-21 23:49:56.051 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Copying file /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk to 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:49:56 np0005591285 nova_compute[182755]: 2026-01-21 23:49:56.052 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:56 np0005591285 nova_compute[182755]: 2026-01-21 23:49:56.967 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "scp -r /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk" returned: 0 in 0.915s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:56 np0005591285 nova_compute[182755]: 2026-01-21 23:49:56.969 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Copying file /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:49:56 np0005591285 nova_compute[182755]: 2026-01-21 23:49:56.970 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk.config 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.222 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "scp -C -r /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk.config 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.224 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Copying file /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.225 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk.info 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.269 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.491 182759 DEBUG oslo_concurrency.processutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "scp -C -r /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_resize/disk.info 192.168.122.101:/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" returned: 0 in 0.266s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.676 182759 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "0f91ac3a-2383-45bf-94b7-631c1737e936-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.677 182759 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:49:57 np0005591285 nova_compute[182755]: 2026-01-21 23:49:57.677 182759 DEBUG oslo_concurrency.lockutils [None req-542b52cb-55c3-4eb4-88d8-187b192e758e 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:49:58 np0005591285 podman[215464]: 2026-01-21 23:49:58.2471323 +0000 UTC m=+0.095702532 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:49:58 np0005591285 podman[215465]: 2026-01-21 23:49:58.255281573 +0000 UTC m=+0.094460129 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:49:59 np0005591285 nova_compute[182755]: 2026-01-21 23:49:59.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:02 np0005591285 nova_compute[182755]: 2026-01-21 23:50:02.273 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:02.956 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:02.957 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:02.957 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:03 np0005591285 nova_compute[182755]: 2026-01-21 23:50:03.382 182759 INFO nova.compute.manager [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Swapping old allocation on dict_keys(['e96a8776-a298-4c19-937a-402cb8191067']) held by migration d3b9acf4-73a1-472f-99a3-614594e27415 for instance#033[00m
Jan 21 18:50:03 np0005591285 nova_compute[182755]: 2026-01-21 23:50:03.438 182759 DEBUG nova.scheduler.client.report [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Overwriting current allocation {'allocations': {'39680711-70c9-4df1-ae59-25e54fac688d': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 30}}, 'project_id': '95574103d0094883861c58d01690e5a3', 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'consumer_generation': 1} on consumer 0f91ac3a-2383-45bf-94b7-631c1737e936 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 21 18:50:03 np0005591285 nova_compute[182755]: 2026-01-21 23:50:03.660 182759 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:50:03 np0005591285 nova_compute[182755]: 2026-01-21 23:50:03.661 182759 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:50:03 np0005591285 nova_compute[182755]: 2026-01-21 23:50:03.661 182759 DEBUG nova.network.neutron [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:50:03 np0005591285 nova_compute[182755]: 2026-01-21 23:50:03.823 182759 DEBUG nova.network.neutron [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.250 182759 DEBUG nova.network.neutron [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.285 182759 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.286 182759 DEBUG nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.301 182759 DEBUG nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.305 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039389.304341, 539483c9-32b1-4c30-b72a-be10a98b79fa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.306 182759 INFO nova.compute.manager [-] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.311 182759 WARNING nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.321 182759 DEBUG nova.virt.libvirt.host [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.322 182759 DEBUG nova.virt.libvirt.host [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.326 182759 DEBUG nova.virt.libvirt.host [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.327 182759 DEBUG nova.virt.libvirt.host [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.329 182759 DEBUG nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.330 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.331 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.331 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.331 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.332 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.332 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.333 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.333 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.334 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.334 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.335 182759 DEBUG nova.virt.hardware [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.335 182759 DEBUG nova.objects.instance [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.341 182759 DEBUG nova.compute.manager [None req-f4bc9183-1ce3-42f3-801b-acf55f08c0e7 - - - - - -] [instance: 539483c9-32b1-4c30-b72a-be10a98b79fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.357 182759 DEBUG oslo_concurrency.processutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.427 182759 DEBUG oslo_concurrency.processutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.428 182759 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.429 182759 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.430 182759 DEBUG oslo_concurrency.lockutils [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:04 np0005591285 nova_compute[182755]: 2026-01-21 23:50:04.435 182759 DEBUG nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <uuid>0f91ac3a-2383-45bf-94b7-631c1737e936</uuid>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <name>instance-00000023</name>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:name>tempest-MigrationsAdminTest-server-1788036298</nova:name>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:50:04</nova:creationTime>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:        <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <entry name="serial">0f91ac3a-2383-45bf-94b7-631c1737e936</entry>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <entry name="uuid">0f91ac3a-2383-45bf-94b7-631c1737e936</entry>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/disk.config"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936/console.log" append="off"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:50:04 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:50:04 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:50:04 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:50:04 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:50:04 np0005591285 systemd-machined[154022]: New machine qemu-13-instance-00000023.
Jan 21 18:50:04 np0005591285 systemd[1]: Started Virtual Machine qemu-13-instance-00000023.
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.025 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 0f91ac3a-2383-45bf-94b7-631c1737e936 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.027 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039405.0241485, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.027 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.031 182759 DEBUG nova.compute.manager [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.039 182759 INFO nova.virt.libvirt.driver [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance running successfully.#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.040 182759 DEBUG nova.virt.libvirt.driver [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.066 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.082 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.116 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.117 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039405.026142, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.117 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Started (Lifecycle Event)#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.149 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.154 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.186 182759 INFO nova.compute.manager [None req-dba29aea-48cc-424b-9c0a-415ae10508a7 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance to original state: 'active'#033[00m
Jan 21 18:50:05 np0005591285 nova_compute[182755]: 2026-01-21 23:50:05.189 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.275 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.533 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "0f91ac3a-2383-45bf-94b7-631c1737e936" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.534 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.535 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "0f91ac3a-2383-45bf-94b7-631c1737e936-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.536 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.536 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.558 182759 INFO nova.compute.manager [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Terminating instance#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.579 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.580 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.580 182759 DEBUG nova.network.neutron [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:50:07 np0005591285 nova_compute[182755]: 2026-01-21 23:50:07.866 182759 DEBUG nova.network.neutron [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.326 182759 DEBUG nova.network.neutron [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.345 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-0f91ac3a-2383-45bf-94b7-631c1737e936" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.346 182759 DEBUG nova.compute.manager [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:50:08 np0005591285 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 21 18:50:08 np0005591285 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000023.scope: Consumed 3.926s CPU time.
Jan 21 18:50:08 np0005591285 systemd-machined[154022]: Machine qemu-13-instance-00000023 terminated.
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.615 182759 INFO nova.virt.libvirt.driver [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance destroyed successfully.#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.616 182759 DEBUG nova.objects.instance [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 0f91ac3a-2383-45bf-94b7-631c1737e936 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.632 182759 INFO nova.virt.libvirt.driver [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Deleting instance files /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_del#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.641 182759 INFO nova.virt.libvirt.driver [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Deletion of /var/lib/nova/instances/0f91ac3a-2383-45bf-94b7-631c1737e936_del complete#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.714 182759 INFO nova.compute.manager [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.718 182759 DEBUG oslo.service.loopingcall [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.719 182759 DEBUG nova.compute.manager [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.719 182759 DEBUG nova.network.neutron [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.928 182759 DEBUG nova.network.neutron [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.944 182759 DEBUG nova.network.neutron [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:08 np0005591285 nova_compute[182755]: 2026-01-21 23:50:08.960 182759 INFO nova.compute.manager [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Took 0.24 seconds to deallocate network for instance.#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.054 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.055 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.149 182759 DEBUG nova.compute.provider_tree [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.169 182759 DEBUG nova.scheduler.client.report [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.213 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.241 182759 INFO nova.scheduler.client.report [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Deleted allocations for instance 0f91ac3a-2383-45bf-94b7-631c1737e936#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:09 np0005591285 nova_compute[182755]: 2026-01-21 23:50:09.359 182759 DEBUG oslo_concurrency.lockutils [None req-607f3cd3-5a06-46f1-9605-cc2449a6e508 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "0f91ac3a-2383-45bf-94b7-631c1737e936" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:12 np0005591285 podman[215543]: 2026-01-21 23:50:12.240681462 +0000 UTC m=+0.095062955 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:50:12 np0005591285 nova_compute[182755]: 2026-01-21 23:50:12.277 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:14 np0005591285 podman[215564]: 2026-01-21 23:50:14.250759935 +0000 UTC m=+0.110383347 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 21 18:50:14 np0005591285 nova_compute[182755]: 2026-01-21 23:50:14.338 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:17 np0005591285 nova_compute[182755]: 2026-01-21 23:50:17.279 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:19 np0005591285 nova_compute[182755]: 2026-01-21 23:50:19.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:19 np0005591285 nova_compute[182755]: 2026-01-21 23:50:19.340 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:20 np0005591285 nova_compute[182755]: 2026-01-21 23:50:20.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:20 np0005591285 nova_compute[182755]: 2026-01-21 23:50:20.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:21 np0005591285 podman[215587]: 2026-01-21 23:50:21.193753967 +0000 UTC m=+0.063630730 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.245 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.748 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.749 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.775 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.927 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.928 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.940 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:50:21 np0005591285 nova_compute[182755]: 2026-01-21 23:50:21.941 182759 INFO nova.compute.claims [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.102 182759 DEBUG nova.compute.provider_tree [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.124 182759 DEBUG nova.scheduler.client.report [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.164 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.165 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.234 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.235 182759 DEBUG nova.network.neutron [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.260 182759 INFO nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.314 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.458 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.459 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.459 182759 INFO nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Creating image(s)#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.460 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "/var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.460 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "/var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.461 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "/var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.475 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.577 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.579 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.581 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.609 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.696 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.699 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.745 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.747 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.748 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.831 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.833 182759 DEBUG nova.virt.disk.api [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Checking if we can resize image /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.834 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.894 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.895 182759 DEBUG nova.virt.disk.api [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Cannot resize image /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.895 182759 DEBUG nova.objects.instance [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lazy-loading 'migration_context' on Instance uuid 31e5706d-d327-4283-affc-6a3f60b78063 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.917 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.918 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Ensure instance console log exists: /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.918 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.919 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:22 np0005591285 nova_compute[182755]: 2026-01-21 23:50:22.919 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:23 np0005591285 nova_compute[182755]: 2026-01-21 23:50:23.076 182759 DEBUG nova.policy [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '85650249f90d4a3b8aea823b57ea554f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7dd3a6bac624dfeb76708960fbea805', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.144 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:50:23.149 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:50:23 np0005591285 nova_compute[182755]: 2026-01-21 23:50:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:23 np0005591285 nova_compute[182755]: 2026-01-21 23:50:23.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:50:23 np0005591285 nova_compute[182755]: 2026-01-21 23:50:23.613 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039408.6107163, 0f91ac3a-2383-45bf-94b7-631c1737e936 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:23 np0005591285 nova_compute[182755]: 2026-01-21 23:50:23.613 182759 INFO nova.compute.manager [-] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:50:23 np0005591285 nova_compute[182755]: 2026-01-21 23:50:23.637 182759 DEBUG nova.compute.manager [None req-631cf5c2-763c-469d-ac4a-7765eed65062 - - - - - -] [instance: 0f91ac3a-2383-45bf-94b7-631c1737e936] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:24 np0005591285 nova_compute[182755]: 2026-01-21 23:50:24.096 182759 DEBUG nova.network.neutron [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Successfully created port: ebad2ab9-91b4-44f7-9b48-683234f5b24d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:50:24 np0005591285 nova_compute[182755]: 2026-01-21 23:50:24.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:24 np0005591285 nova_compute[182755]: 2026-01-21 23:50:24.343 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.248 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:50:25 np0005591285 podman[215627]: 2026-01-21 23:50:25.282023821 +0000 UTC m=+0.140690623 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.425 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.427 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5686MB free_disk=73.37637710571289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.427 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.427 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.507 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 31e5706d-d327-4283-affc-6a3f60b78063 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.508 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.508 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.549 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.567 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.597 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.598 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.847 182759 DEBUG nova.network.neutron [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Successfully updated port: ebad2ab9-91b4-44f7-9b48-683234f5b24d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.863 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.863 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquired lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:50:25 np0005591285 nova_compute[182755]: 2026-01-21 23:50:25.863 182759 DEBUG nova.network.neutron [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:50:26 np0005591285 nova_compute[182755]: 2026-01-21 23:50:26.040 182759 DEBUG nova.compute.manager [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-changed-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:50:26 np0005591285 nova_compute[182755]: 2026-01-21 23:50:26.041 182759 DEBUG nova.compute.manager [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Refreshing instance network info cache due to event network-changed-ebad2ab9-91b4-44f7-9b48-683234f5b24d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:50:26 np0005591285 nova_compute[182755]: 2026-01-21 23:50:26.041 182759 DEBUG oslo_concurrency.lockutils [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:50:26 np0005591285 nova_compute[182755]: 2026-01-21 23:50:26.193 182759 DEBUG nova.network.neutron [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.284 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.547 182759 DEBUG nova.network.neutron [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Updating instance_info_cache with network_info: [{"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.582 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Releasing lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.583 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Instance network_info: |[{"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.583 182759 DEBUG oslo_concurrency.lockutils [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.583 182759 DEBUG nova.network.neutron [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Refreshing network info cache for port ebad2ab9-91b4-44f7-9b48-683234f5b24d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.587 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Start _get_guest_xml network_info=[{"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.594 182759 WARNING nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.597 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.608 182759 DEBUG nova.virt.libvirt.host [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.610 182759 DEBUG nova.virt.libvirt.host [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.616 182759 DEBUG nova.virt.libvirt.host [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.618 182759 DEBUG nova.virt.libvirt.host [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.620 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.620 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.621 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.621 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.622 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.622 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.623 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.623 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.623 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.624 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.624 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.624 182759 DEBUG nova.virt.hardware [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.632 182759 DEBUG nova.virt.libvirt.vif [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-2071487848',display_name='tempest-ServersTestManualDisk-server-2071487848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-2071487848',id=36,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ6RAQLgw2u4CM7F628XdMwgGoQHo7Cj5eEJZtrGWO7PU9f/CieBUmud/DgBl17nCO7a4E+JzuVuVInm7vYerW4jUfpaUxKWjQ8cUcIQB1cj9HDIYFwKtBjWq7X/VW4G3g==',key_name='tempest-keypair-9054980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7dd3a6bac624dfeb76708960fbea805',ramdisk_id='',reservation_id='r-uv009hhq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-920151640',owner_user_name='tempest-ServersTestManualDisk-920151640-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='85650249f90d4a3b8aea823b57ea554f',uuid=31e5706d-d327-4283-affc-6a3f60b78063,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.633 182759 DEBUG nova.network.os_vif_util [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Converting VIF {"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.634 182759 DEBUG nova.network.os_vif_util [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
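Annotation: the converted VIF's `vif_name='tapebad2ab9-91'` follows the nova/neutron tap-naming convention: a `tap` prefix plus the port UUID, truncated to 14 characters (Linux itself allows 15, IFNAMSIZ-1). A sketch of the rule, assuming that convention; `tap_devname` and `LINUX_DEV_LEN` are illustrative names, not nova's actual symbols:

```python
# "tap" + port UUID, capped at 14 chars, matches the devname in the log.
LINUX_DEV_LEN = 14

def tap_devname(port_id: str) -> str:
    return ("tap" + port_id)[:LINUX_DEV_LEN]

print(tap_devname("ebad2ab9-91b4-44f7-9b48-683234f5b24d"))  # tapebad2ab9-91
```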
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.636 182759 DEBUG nova.objects.instance [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lazy-loading 'pci_devices' on Instance uuid 31e5706d-d327-4283-affc-6a3f60b78063 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.652 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <uuid>31e5706d-d327-4283-affc-6a3f60b78063</uuid>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <name>instance-00000024</name>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersTestManualDisk-server-2071487848</nova:name>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:50:27</nova:creationTime>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:user uuid="85650249f90d4a3b8aea823b57ea554f">tempest-ServersTestManualDisk-920151640-project-member</nova:user>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:project uuid="b7dd3a6bac624dfeb76708960fbea805">tempest-ServersTestManualDisk-920151640</nova:project>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        <nova:port uuid="ebad2ab9-91b4-44f7-9b48-683234f5b24d">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <entry name="serial">31e5706d-d327-4283-affc-6a3f60b78063</entry>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <entry name="uuid">31e5706d-d327-4283-affc-6a3f60b78063</entry>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.config"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:46:1e:4f"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <target dev="tapebad2ab9-91"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/console.log" append="off"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:50:27 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:50:27 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:50:27 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:50:27 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
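Annotation: the domain XML dumped by `_get_guest_xml` above is plain libvirt XML. A stdlib sketch parsing an abbreviated copy of it; note that libvirt treats a unit-less `<memory>` as KiB, so `131072` lines up with `memory_mb=128` in the Instance record:

```python
import xml.etree.ElementTree as ET

# Abbreviated copy of the domain XML logged above.
DOMAIN_XML = """\
<domain type="kvm">
  <uuid>31e5706d-d327-4283-affc-6a3f60b78063</uuid>
  <name>instance-00000024</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <devices>
    <disk type="file" device="disk">
      <target dev="vda" bus="virtio"/>
    </disk>
    <disk type="file" device="cdrom">
      <target dev="sda" bus="sata"/>
    </disk>
  </devices>
</domain>
"""

dom = ET.fromstring(DOMAIN_XML)
memory_mib = int(dom.findtext("memory")) // 1024  # libvirt default unit: KiB
disks = [d.find("target").get("dev") for d in dom.iter("disk")]
print(memory_mib, disks)
```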
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.653 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Preparing to wait for external event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.653 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.654 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.654 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
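Annotation: the three lockutils lines above show a named lock guarding `_create_or_get_event`, so that the thread waiting for `network-vif-plugged-…` and the thread that later delivers it agree on a single event object. A toy re-creation of that pattern with `threading` (names and key format are illustrative, not nova's actual code):

```python
import threading

_events = {}              # one Event per (instance, event-name) pair
_lock = threading.Lock()  # plays the role of the "...-events" named lock

def prepare_for_instance_event(instance_uuid, event_name):
    key = f"{instance_uuid}-events:{event_name}"
    with _lock:  # "Acquiring lock ... acquired ... released" in the log
        return _events.setdefault(key, threading.Event())

ev = prepare_for_instance_event(
    "31e5706d-d327-4283-affc-6a3f60b78063",
    "network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d",
)
```

A second caller with the same instance and event name gets the very same `Event`, which is what makes wait/notify between threads work.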
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.656 182759 DEBUG nova.virt.libvirt.vif [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-2071487848',display_name='tempest-ServersTestManualDisk-server-2071487848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-2071487848',id=36,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ6RAQLgw2u4CM7F628XdMwgGoQHo7Cj5eEJZtrGWO7PU9f/CieBUmud/DgBl17nCO7a4E+JzuVuVInm7vYerW4jUfpaUxKWjQ8cUcIQB1cj9HDIYFwKtBjWq7X/VW4G3g==',key_name='tempest-keypair-9054980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7dd3a6bac624dfeb76708960fbea805',ramdisk_id='',reservation_id='r-uv009hhq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-920151640',owner_user_name='tempest-ServersTestManualDisk-920151640-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='85650249f90d4a3b8aea823b57ea554f',uuid=31e5706d-d327-4283-affc-6a3f60b78063,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.656 182759 DEBUG nova.network.os_vif_util [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Converting VIF {"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.657 182759 DEBUG nova.network.os_vif_util [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.658 182759 DEBUG os_vif [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.659 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.660 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.661 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.672 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.673 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebad2ab9-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.674 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebad2ab9-91, col_values=(('external_ids', {'iface-id': 'ebad2ab9-91b4-44f7-9b48-683234f5b24d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:1e:4f', 'vm-uuid': '31e5706d-d327-4283-affc-6a3f60b78063'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
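Annotation: the `AddPortCommand` + `DbSetCommand` transaction above is roughly what a single `ovs-vsctl` invocation would do by hand; the `external_ids:iface-id` value is how ovn-controller later matches this interface to the Neutron logical port (the "Claiming lport" lines below). The argv here is a reconstruction for illustration, not taken from the log:

```python
port = "tapebad2ab9-91"
external_ids = {
    "iface-id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d",  # neutron port UUID
    "iface-status": "active",
    "attached-mac": "fa:16:3e:46:1e:4f",
    "vm-uuid": "31e5706d-d327-4283-affc-6a3f60b78063",
}
argv = ["ovs-vsctl", "--may-exist", "add-port", "br-int", port,
        "--", "set", "Interface", port]
argv += [f"external_ids:{k}={v}" for k, v in external_ids.items()]
print(" ".join(argv))
```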
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.677 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:27 np0005591285 NetworkManager[55017]: <info>  [1769039427.6788] manager: (tapebad2ab9-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.681 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.688 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.689 182759 INFO os_vif [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91')#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.747 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.747 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.748 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] No VIF found with MAC fa:16:3e:46:1e:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:50:27 np0005591285 nova_compute[182755]: 2026-01-21 23:50:27.748 182759 INFO nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Using config drive#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.113 182759 INFO nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Creating config drive at /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.config#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.126 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprb30pqnx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.262 182759 DEBUG oslo_concurrency.processutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprb30pqnx" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
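Annotation: the `mkisofs` run above builds the config drive ISO; the piece guest tooling such as cloud-init keys on is the volume label `-V config-2`. A sketch splitting an abbreviated copy of the logged command (the multi-word `-publisher` value is omitted for brevity):

```python
import shlex

# Abbreviated form of the mkisofs command from the log line above.
cmd = shlex.split(
    "/usr/bin/mkisofs -o disk.config -ldots -allow-lowercase -allow-multidot "
    "-l -quiet -J -r -V config-2 /tmp/tmprb30pqnx"
)
label = cmd[cmd.index("-V") + 1]
print(label)  # config-2
```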
Jan 21 18:50:28 np0005591285 kernel: tapebad2ab9-91: entered promiscuous mode
Jan 21 18:50:28 np0005591285 NetworkManager[55017]: <info>  [1769039428.3653] manager: (tapebad2ab9-91): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Jan 21 18:50:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:28Z|00090|binding|INFO|Claiming lport ebad2ab9-91b4-44f7-9b48-683234f5b24d for this chassis.
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.408 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:28Z|00091|binding|INFO|ebad2ab9-91b4-44f7-9b48-683234f5b24d: Claiming fa:16:3e:46:1e:4f 10.100.0.11
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.414 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 podman[215660]: 2026-01-21 23:50:28.414959168 +0000 UTC m=+0.109822914 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:50:28 np0005591285 systemd-udevd[215703]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 NetworkManager[55017]: <info>  [1769039428.4353] device (tapebad2ab9-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:50:28 np0005591285 NetworkManager[55017]: <info>  [1769039428.4360] device (tapebad2ab9-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.434 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:1e:4f 10.100.0.11'], port_security=['fa:16:3e:46:1e:4f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '31e5706d-d327-4283-affc-6a3f60b78063', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7dd3a6bac624dfeb76708960fbea805', 'neutron:revision_number': '2', 'neutron:security_group_ids': '14c7e5ff-710e-43ee-89f1-095dbe750986', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91d6bebd-4af5-4e84-b131-0a75959656c1, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ebad2ab9-91b4-44f7-9b48-683234f5b24d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.436 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ebad2ab9-91b4-44f7-9b48-683234f5b24d in datapath fe7d64db-d08e-4a1e-9c22-107fc8f6cdce bound to our chassis#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.439 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7d64db-d08e-4a1e-9c22-107fc8f6cdce#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.454 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5e35f207-2474-43cd-b30a-5128ecf57c2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.455 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe7d64db-d1 in ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.459 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe7d64db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.459 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7953cd5c-0fa3-465c-be64-bdf2d22bd1bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.460 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[102106a3-36b4-41e1-b0b7-392215e95213]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 systemd-machined[154022]: New machine qemu-14-instance-00000024.
Jan 21 18:50:28 np0005591285 podman[215663]: 2026-01-21 23:50:28.467562817 +0000 UTC m=+0.146703780 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.478 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6e118a72-cffd-45ed-a704-ecd068752c77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:28Z|00092|binding|INFO|Setting lport ebad2ab9-91b4-44f7-9b48-683234f5b24d ovn-installed in OVS
Jan 21 18:50:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:28Z|00093|binding|INFO|Setting lport ebad2ab9-91b4-44f7-9b48-683234f5b24d up in Southbound
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.498 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 systemd[1]: Started Virtual Machine qemu-14-instance-00000024.
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.511 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d0878c0f-dd7f-4db1-823b-afa87cb538d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.551 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8eeadee9-71e6-4556-a0d6-48ef0eda9aa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 NetworkManager[55017]: <info>  [1769039428.5596] manager: (tapfe7d64db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.559 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3daba6-1db0-43c8-bef8-49172a2564d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.616 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[79ad3f87-bf49-471e-96d9-da681f27eac3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.623 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6be47875-a0c2-4b65-b533-2260e24ed1c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 NetworkManager[55017]: <info>  [1769039428.6571] device (tapfe7d64db-d0): carrier: link connected
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.667 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8f10ed48-ac04-4ab7-b287-f316adfecc7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.697 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[17250f71-72f3-4df1-9d57-0104736e052c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7d64db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:21:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398731, 'reachable_time': 38811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215747, 'error': None, 'target': 'ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.723 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5c61d5-8c6f-4b7c-80f3-a774ecaa6a63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:212e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398731, 'tstamp': 398731}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215748, 'error': None, 'target': 'ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.753 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[baa38644-e38d-4080-8160-8359e305a2b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7d64db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:21:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398731, 'reachable_time': 38811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215749, 'error': None, 'target': 'ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.811 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a199dab-0490-4e23-bb19-815948751c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.915 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7db985db-4567-45f4-991a-b5a073c986df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.918 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7d64db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.919 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.920 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7d64db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.923 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 NetworkManager[55017]: <info>  [1769039428.9246] manager: (tapfe7d64db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 21 18:50:28 np0005591285 kernel: tapfe7d64db-d0: entered promiscuous mode
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.928 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.937 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7d64db-d0, col_values=(('external_ids', {'iface-id': '38be274f-21e1-4f3b-b2be-0a4430f5cc37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.939 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:28Z|00094|binding|INFO|Releasing lport 38be274f-21e1-4f3b-b2be-0a4430f5cc37 from this chassis (sb_readonly=0)
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.941 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.965 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.965 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7d64db-d08e-4a1e-9c22-107fc8f6cdce.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7d64db-d08e-4a1e-9c22-107fc8f6cdce.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.974 182759 DEBUG nova.compute.manager [req-92ecb624-4862-4184-89d8-1f41eff64ea9 req-caf1c680-79fa-487a-a40f-f1c1e3408220 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.968 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[49d9c7ab-2a75-4c8e-aaf3-67b69d8d37d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.969 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/fe7d64db-d08e-4a1e-9c22-107fc8f6cdce.pid.haproxy
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID fe7d64db-d08e-4a1e-9c22-107fc8f6cdce
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.975 182759 DEBUG oslo_concurrency.lockutils [req-92ecb624-4862-4184-89d8-1f41eff64ea9 req-caf1c680-79fa-487a-a40f-f1c1e3408220 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:28.970 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'env', 'PROCESS_TAG=haproxy-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe7d64db-d08e-4a1e-9c22-107fc8f6cdce.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.975 182759 DEBUG oslo_concurrency.lockutils [req-92ecb624-4862-4184-89d8-1f41eff64ea9 req-caf1c680-79fa-487a-a40f-f1c1e3408220 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.976 182759 DEBUG oslo_concurrency.lockutils [req-92ecb624-4862-4184-89d8-1f41eff64ea9 req-caf1c680-79fa-487a-a40f-f1c1e3408220 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:28 np0005591285 nova_compute[182755]: 2026-01-21 23:50:28.976 182759 DEBUG nova.compute.manager [req-92ecb624-4862-4184-89d8-1f41eff64ea9 req-caf1c680-79fa-487a-a40f-f1c1e3408220 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Processing event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.205 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039429.204543, 31e5706d-d327-4283-affc-6a3f60b78063 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.206 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] VM Started (Lifecycle Event)#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.216 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.237 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.242 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.246 182759 INFO nova.virt.libvirt.driver [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Instance spawned successfully.#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.247 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.255 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.278 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.279 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.279 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.280 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.280 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.281 182759 DEBUG nova.virt.libvirt.driver [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.299 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.300 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039429.204916, 31e5706d-d327-4283-affc-6a3f60b78063 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.300 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.328 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.333 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039429.234087, 31e5706d-d327-4283-affc-6a3f60b78063 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.333 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.346 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.361 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.366 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.377 182759 DEBUG nova.network.neutron [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Updated VIF entry in instance network info cache for port ebad2ab9-91b4-44f7-9b48-683234f5b24d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.377 182759 DEBUG nova.network.neutron [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Updating instance_info_cache with network_info: [{"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.396 182759 INFO nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Took 6.94 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.396 182759 DEBUG nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.404 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.404 182759 DEBUG oslo_concurrency.lockutils [req-1a783ed2-9879-4f7a-b4db-9a2d9f370421 req-9273e641-8eab-4565-ad38-212d125157a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:50:29 np0005591285 podman[215788]: 2026-01-21 23:50:29.496720033 +0000 UTC m=+0.076128439 container create 438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.508 182759 INFO nova.compute.manager [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Took 7.65 seconds to build instance.#033[00m
Jan 21 18:50:29 np0005591285 nova_compute[182755]: 2026-01-21 23:50:29.534 182759 DEBUG oslo_concurrency.lockutils [None req-e2e7e875-02d5-4aba-8d1c-694ca0e921b7 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:29 np0005591285 podman[215788]: 2026-01-21 23:50:29.455916101 +0000 UTC m=+0.035324557 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:50:29 np0005591285 systemd[1]: Started libpod-conmon-438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88.scope.
Jan 21 18:50:29 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:50:29 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/579997afcd44ec25de4abd36e3335c1ecd7f8ba14afef63d29a9d3ed15182a1e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:50:29 np0005591285 podman[215788]: 2026-01-21 23:50:29.620613163 +0000 UTC m=+0.200021569 container init 438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 18:50:29 np0005591285 podman[215788]: 2026-01-21 23:50:29.628782948 +0000 UTC m=+0.208191324 container start 438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:50:29 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [NOTICE]   (215807) : New worker (215809) forked
Jan 21 18:50:29 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [NOTICE]   (215807) : Loading success.
Jan 21 18:50:31 np0005591285 nova_compute[182755]: 2026-01-21 23:50:31.119 182759 DEBUG nova.compute.manager [req-5385ea57-f68e-4d91-8b33-019059ef7c9c req-2b12b7db-6385-4cd5-97d7-841732029d35 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:50:31 np0005591285 nova_compute[182755]: 2026-01-21 23:50:31.120 182759 DEBUG oslo_concurrency.lockutils [req-5385ea57-f68e-4d91-8b33-019059ef7c9c req-2b12b7db-6385-4cd5-97d7-841732029d35 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:31 np0005591285 nova_compute[182755]: 2026-01-21 23:50:31.120 182759 DEBUG oslo_concurrency.lockutils [req-5385ea57-f68e-4d91-8b33-019059ef7c9c req-2b12b7db-6385-4cd5-97d7-841732029d35 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:31 np0005591285 nova_compute[182755]: 2026-01-21 23:50:31.121 182759 DEBUG oslo_concurrency.lockutils [req-5385ea57-f68e-4d91-8b33-019059ef7c9c req-2b12b7db-6385-4cd5-97d7-841732029d35 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:31 np0005591285 nova_compute[182755]: 2026-01-21 23:50:31.121 182759 DEBUG nova.compute.manager [req-5385ea57-f68e-4d91-8b33-019059ef7c9c req-2b12b7db-6385-4cd5-97d7-841732029d35 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] No waiting events found dispatching network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:50:31 np0005591285 nova_compute[182755]: 2026-01-21 23:50:31.121 182759 WARNING nova.compute.manager [req-5385ea57-f68e-4d91-8b33-019059ef7c9c req-2b12b7db-6385-4cd5-97d7-841732029d35 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received unexpected event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d for instance with vm_state active and task_state None.#033[00m
Jan 21 18:50:32 np0005591285 NetworkManager[55017]: <info>  [1769039432.4450] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.446 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:32 np0005591285 NetworkManager[55017]: <info>  [1769039432.4471] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:32 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:32Z|00095|binding|INFO|Releasing lport 38be274f-21e1-4f3b-b2be-0a4430f5cc37 from this chassis (sb_readonly=0)
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.591 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.677 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.971 182759 DEBUG nova.compute.manager [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-changed-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.972 182759 DEBUG nova.compute.manager [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Refreshing instance network info cache due to event network-changed-ebad2ab9-91b4-44f7-9b48-683234f5b24d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.972 182759 DEBUG oslo_concurrency.lockutils [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.972 182759 DEBUG oslo_concurrency.lockutils [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:50:32 np0005591285 nova_compute[182755]: 2026-01-21 23:50:32.972 182759 DEBUG nova.network.neutron [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Refreshing network info cache for port ebad2ab9-91b4-44f7-9b48-683234f5b24d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:50:34 np0005591285 nova_compute[182755]: 2026-01-21 23:50:34.348 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:35 np0005591285 nova_compute[182755]: 2026-01-21 23:50:35.379 182759 DEBUG nova.network.neutron [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Updated VIF entry in instance network info cache for port ebad2ab9-91b4-44f7-9b48-683234f5b24d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:50:35 np0005591285 nova_compute[182755]: 2026-01-21 23:50:35.379 182759 DEBUG nova.network.neutron [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Updating instance_info_cache with network_info: [{"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:35 np0005591285 nova_compute[182755]: 2026-01-21 23:50:35.428 182759 DEBUG oslo_concurrency.lockutils [req-5174a832-c987-44c1-a066-38a9d27adbce req-3d55f7b8-405e-4d22-89cd-cf584fddecf7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-31e5706d-d327-4283-affc-6a3f60b78063" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:50:37 np0005591285 nova_compute[182755]: 2026-01-21 23:50:37.680 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:39 np0005591285 nova_compute[182755]: 2026-01-21 23:50:39.374 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:42 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:42Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:46:1e:4f 10.100.0.11
Jan 21 18:50:42 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:42Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:46:1e:4f 10.100.0.11
Jan 21 18:50:42 np0005591285 nova_compute[182755]: 2026-01-21 23:50:42.716 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:43 np0005591285 podman[215844]: 2026-01-21 23:50:43.239584838 +0000 UTC m=+0.095738144 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:50:44 np0005591285 nova_compute[182755]: 2026-01-21 23:50:44.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:45 np0005591285 podman[215864]: 2026-01-21 23:50:45.249362823 +0000 UTC m=+0.103982000 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter)
Jan 21 18:50:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:46.843 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:50:46 np0005591285 nova_compute[182755]: 2026-01-21 23:50:46.844 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:46.845 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:50:47 np0005591285 nova_compute[182755]: 2026-01-21 23:50:47.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:49 np0005591285 nova_compute[182755]: 2026-01-21 23:50:49.381 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:52 np0005591285 podman[215886]: 2026-01-21 23:50:52.206818035 +0000 UTC m=+0.063953700 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:50:52 np0005591285 nova_compute[182755]: 2026-01-21 23:50:52.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:53.849 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:54 np0005591285 nova_compute[182755]: 2026-01-21 23:50:54.383 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:56 np0005591285 podman[215910]: 2026-01-21 23:50:56.237208619 +0000 UTC m=+0.105583600 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.581 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.583 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.584 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.584 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.585 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.603 182759 INFO nova.compute.manager [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Terminating instance#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.616 182759 DEBUG nova.compute.manager [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:50:56 np0005591285 kernel: tapebad2ab9-91 (unregistering): left promiscuous mode
Jan 21 18:50:56 np0005591285 NetworkManager[55017]: <info>  [1769039456.6412] device (tapebad2ab9-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:50:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:56Z|00096|binding|INFO|Releasing lport ebad2ab9-91b4-44f7-9b48-683234f5b24d from this chassis (sb_readonly=0)
Jan 21 18:50:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:56Z|00097|binding|INFO|Setting lport ebad2ab9-91b4-44f7-9b48-683234f5b24d down in Southbound
Jan 21 18:50:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:50:56Z|00098|binding|INFO|Removing iface tapebad2ab9-91 ovn-installed in OVS
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.653 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:56.673 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:1e:4f 10.100.0.11'], port_security=['fa:16:3e:46:1e:4f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '31e5706d-d327-4283-affc-6a3f60b78063', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7dd3a6bac624dfeb76708960fbea805', 'neutron:revision_number': '4', 'neutron:security_group_ids': '14c7e5ff-710e-43ee-89f1-095dbe750986', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91d6bebd-4af5-4e84-b131-0a75959656c1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ebad2ab9-91b4-44f7-9b48-683234f5b24d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:50:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:56.676 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ebad2ab9-91b4-44f7-9b48-683234f5b24d in datapath fe7d64db-d08e-4a1e-9c22-107fc8f6cdce unbound from our chassis#033[00m
Jan 21 18:50:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:56.678 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:50:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:56.680 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb4ece8-f5c6-4f84-9834-cbcbbbae39f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:56.681 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce namespace which is not needed anymore#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.683 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:56 np0005591285 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 21 18:50:56 np0005591285 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000024.scope: Consumed 14.345s CPU time.
Jan 21 18:50:56 np0005591285 systemd-machined[154022]: Machine qemu-14-instance-00000024 terminated.
Jan 21 18:50:56 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [NOTICE]   (215807) : haproxy version is 2.8.14-c23fe91
Jan 21 18:50:56 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [NOTICE]   (215807) : path to executable is /usr/sbin/haproxy
Jan 21 18:50:56 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [WARNING]  (215807) : Exiting Master process...
Jan 21 18:50:56 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [WARNING]  (215807) : Exiting Master process...
Jan 21 18:50:56 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [ALERT]    (215807) : Current worker (215809) exited with code 143 (Terminated)
Jan 21 18:50:56 np0005591285 neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce[215803]: [WARNING]  (215807) : All workers exited. Exiting... (0)
Jan 21 18:50:56 np0005591285 systemd[1]: libpod-438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88.scope: Deactivated successfully.
Jan 21 18:50:56 np0005591285 podman[215960]: 2026-01-21 23:50:56.877037649 +0000 UTC m=+0.063249811 container died 438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 18:50:56 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88-userdata-shm.mount: Deactivated successfully.
Jan 21 18:50:56 np0005591285 systemd[1]: var-lib-containers-storage-overlay-579997afcd44ec25de4abd36e3335c1ecd7f8ba14afef63d29a9d3ed15182a1e-merged.mount: Deactivated successfully.
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.915 182759 INFO nova.virt.libvirt.driver [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Instance destroyed successfully.#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.916 182759 DEBUG nova.objects.instance [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lazy-loading 'resources' on Instance uuid 31e5706d-d327-4283-affc-6a3f60b78063 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:50:56 np0005591285 podman[215960]: 2026-01-21 23:50:56.925242743 +0000 UTC m=+0.111454945 container cleanup 438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.939 182759 DEBUG nova.virt.libvirt.vif [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:50:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-2071487848',display_name='tempest-ServersTestManualDisk-server-2071487848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-2071487848',id=36,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ6RAQLgw2u4CM7F628XdMwgGoQHo7Cj5eEJZtrGWO7PU9f/CieBUmud/DgBl17nCO7a4E+JzuVuVInm7vYerW4jUfpaUxKWjQ8cUcIQB1cj9HDIYFwKtBjWq7X/VW4G3g==',key_name='tempest-keypair-9054980',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:50:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7dd3a6bac624dfeb76708960fbea805',ramdisk_id='',reservation_id='r-uv009hhq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-920151640',owner_user_name='tempest-ServersTestManualDisk-920151640-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:50:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='85650249f90d4a3b8aea823b57ea554f',uuid=31e5706d-d327-4283-affc-6a3f60b78063,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.940 182759 DEBUG nova.network.os_vif_util [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Converting VIF {"id": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "address": "fa:16:3e:46:1e:4f", "network": {"id": "fe7d64db-d08e-4a1e-9c22-107fc8f6cdce", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-561965875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7dd3a6bac624dfeb76708960fbea805", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebad2ab9-91", "ovs_interfaceid": "ebad2ab9-91b4-44f7-9b48-683234f5b24d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.941 182759 DEBUG nova.network.os_vif_util [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.941 182759 DEBUG os_vif [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.944 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.945 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebad2ab9-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.953 182759 INFO os_vif [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:1e:4f,bridge_name='br-int',has_traffic_filtering=True,id=ebad2ab9-91b4-44f7-9b48-683234f5b24d,network=Network(fe7d64db-d08e-4a1e-9c22-107fc8f6cdce),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebad2ab9-91')#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.953 182759 INFO nova.virt.libvirt.driver [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Deleting instance files /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063_del#033[00m
Jan 21 18:50:56 np0005591285 nova_compute[182755]: 2026-01-21 23:50:56.955 182759 INFO nova.virt.libvirt.driver [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Deletion of /var/lib/nova/instances/31e5706d-d327-4283-affc-6a3f60b78063_del complete#033[00m
Jan 21 18:50:56 np0005591285 systemd[1]: libpod-conmon-438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88.scope: Deactivated successfully.
Jan 21 18:50:57 np0005591285 podman[216006]: 2026-01-21 23:50:57.010246524 +0000 UTC m=+0.053443794 container remove 438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.017 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6302c6f7-8d1b-45ea-a60e-d5f6f6b73cf1]: (4, ('Wed Jan 21 11:50:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce (438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88)\n438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88\nWed Jan 21 11:50:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce (438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88)\n438d32d56f4851558ea3715744e614e033a0b120d7e89bf40f3f623a8a805c88\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.019 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[368e791e-8abe-4793-ab76-0fac7fd57db1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.021 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7d64db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:50:57 np0005591285 kernel: tapfe7d64db-d0: left promiscuous mode
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.024 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.091 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.095 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[64650c96-66ac-406e-b88d-19eba3fbd41c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.101 182759 INFO nova.compute.manager [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.102 182759 DEBUG oslo.service.loopingcall [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.103 182759 DEBUG nova.compute.manager [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.103 182759 DEBUG nova.network.neutron [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.114 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[91a395ab-1940-4c68-ad32-3ce548239efb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.115 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[29328b72-1dd1-46d4-85e0-ccd8415386da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.136 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ef9fbc-67d2-4701-b03b-0ed35791c190]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398720, 'reachable_time': 29870, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216020, 'error': None, 'target': 'ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 systemd[1]: run-netns-ovnmeta\x2dfe7d64db\x2dd08e\x2d4a1e\x2d9c22\x2d107fc8f6cdce.mount: Deactivated successfully.
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.140 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe7d64db-d08e-4a1e-9c22-107fc8f6cdce deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:50:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:50:57.141 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[e08b1eb9-e6ee-49b6-81dd-e5b0db489bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.167 182759 DEBUG nova.compute.manager [req-3b4e2bbd-9dc1-4b9b-aaab-202a9bf2b2f5 req-b77ea1dd-c27b-4ab2-83d3-0dcbfcc99756 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-vif-unplugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.168 182759 DEBUG oslo_concurrency.lockutils [req-3b4e2bbd-9dc1-4b9b-aaab-202a9bf2b2f5 req-b77ea1dd-c27b-4ab2-83d3-0dcbfcc99756 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.168 182759 DEBUG oslo_concurrency.lockutils [req-3b4e2bbd-9dc1-4b9b-aaab-202a9bf2b2f5 req-b77ea1dd-c27b-4ab2-83d3-0dcbfcc99756 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.168 182759 DEBUG oslo_concurrency.lockutils [req-3b4e2bbd-9dc1-4b9b-aaab-202a9bf2b2f5 req-b77ea1dd-c27b-4ab2-83d3-0dcbfcc99756 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.169 182759 DEBUG nova.compute.manager [req-3b4e2bbd-9dc1-4b9b-aaab-202a9bf2b2f5 req-b77ea1dd-c27b-4ab2-83d3-0dcbfcc99756 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] No waiting events found dispatching network-vif-unplugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:50:57 np0005591285 nova_compute[182755]: 2026-01-21 23:50:57.169 182759 DEBUG nova.compute.manager [req-3b4e2bbd-9dc1-4b9b-aaab-202a9bf2b2f5 req-b77ea1dd-c27b-4ab2-83d3-0dcbfcc99756 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-vif-unplugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:50:59 np0005591285 podman[216021]: 2026-01-21 23:50:59.248154325 +0000 UTC m=+0.104397140 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:50:59 np0005591285 podman[216022]: 2026-01-21 23:50:59.250676481 +0000 UTC m=+0.098837894 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.586 182759 DEBUG nova.compute.manager [req-04bd88ab-8ac4-4750-afce-f8e199899141 req-1ddf7a7e-2020-4304-a8ee-06595eab8f9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.587 182759 DEBUG oslo_concurrency.lockutils [req-04bd88ab-8ac4-4750-afce-f8e199899141 req-1ddf7a7e-2020-4304-a8ee-06595eab8f9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "31e5706d-d327-4283-affc-6a3f60b78063-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.587 182759 DEBUG oslo_concurrency.lockutils [req-04bd88ab-8ac4-4750-afce-f8e199899141 req-1ddf7a7e-2020-4304-a8ee-06595eab8f9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.588 182759 DEBUG oslo_concurrency.lockutils [req-04bd88ab-8ac4-4750-afce-f8e199899141 req-1ddf7a7e-2020-4304-a8ee-06595eab8f9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.589 182759 DEBUG nova.compute.manager [req-04bd88ab-8ac4-4750-afce-f8e199899141 req-1ddf7a7e-2020-4304-a8ee-06595eab8f9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] No waiting events found dispatching network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.589 182759 WARNING nova.compute.manager [req-04bd88ab-8ac4-4750-afce-f8e199899141 req-1ddf7a7e-2020-4304-a8ee-06595eab8f9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received unexpected event network-vif-plugged-ebad2ab9-91b4-44f7-9b48-683234f5b24d for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.930 182759 DEBUG nova.network.neutron [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:50:59 np0005591285 nova_compute[182755]: 2026-01-21 23:50:59.953 182759 INFO nova.compute.manager [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Took 2.85 seconds to deallocate network for instance.#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.034 182759 DEBUG nova.compute.manager [req-399065f8-a433-42e3-9805-5ab5a9411af6 req-4102e450-3d5b-49e3-bccb-e47353e095a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Received event network-vif-deleted-ebad2ab9-91b4-44f7-9b48-683234f5b24d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.097 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.098 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.169 182759 DEBUG nova.compute.provider_tree [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.188 182759 DEBUG nova.scheduler.client.report [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.214 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.246 182759 INFO nova.scheduler.client.report [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Deleted allocations for instance 31e5706d-d327-4283-affc-6a3f60b78063#033[00m
Jan 21 18:51:00 np0005591285 nova_compute[182755]: 2026-01-21 23:51:00.381 182759 DEBUG oslo_concurrency.lockutils [None req-eab1f483-bcbe-44af-ad2c-2052169cffc6 85650249f90d4a3b8aea823b57ea554f b7dd3a6bac624dfeb76708960fbea805 - - default default] Lock "31e5706d-d327-4283-affc-6a3f60b78063" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:01 np0005591285 nova_compute[182755]: 2026-01-21 23:51:01.993 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:51:02.957 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:51:02.958 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:51:02.958 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.046 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "5f345aa8-94d2-4213-ab21-fadc362697b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.047 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "5f345aa8-94d2-4213-ab21-fadc362697b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.072 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.104 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.106 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.129 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.182 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.183 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.201 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.202 182759 INFO nova.compute.claims [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.227 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.376 182759 DEBUG nova.compute.provider_tree [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.395 182759 DEBUG nova.scheduler.client.report [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.422 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.423 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.429 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.430 182759 INFO nova.compute.claims [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.471 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "49816e0f-43f1-4649-b3ba-c4365962f714" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.471 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49816e0f-43f1-4649-b3ba-c4365962f714" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.511 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49816e0f-43f1-4649-b3ba-c4365962f714" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.512 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.845 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.846 182759 DEBUG nova.network.neutron [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.864 182759 DEBUG nova.compute.provider_tree [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.881 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.885 182759 DEBUG nova.scheduler.client.report [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.912 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.926 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.954 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "49816e0f-43f1-4649-b3ba-c4365962f714" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.955 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49816e0f-43f1-4649-b3ba-c4365962f714" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.993 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49816e0f-43f1-4649-b3ba-c4365962f714" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:03 np0005591285 nova_compute[182755]: 2026-01-21 23:51:03.994 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.101 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.102 182759 DEBUG nova.network.neutron [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.122 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.124 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.124 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Creating image(s)#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.125 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "/var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.125 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.126 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.139 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.144 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.169 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.210 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.211 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.211 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.222 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.280 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.282 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.321 182759 DEBUG nova.network.neutron [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.322 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.325 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.328 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.329 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Creating image(s)#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.331 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "/var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.332 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.333 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.358 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.388 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk 1073741824" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.393 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.394 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.445 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.447 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.448 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.477 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.499 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.500 182759 DEBUG nova.virt.disk.api [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Checking if we can resize image /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.501 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.538 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.539 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.561 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.563 182759 DEBUG nova.virt.disk.api [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Cannot resize image /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.563 182759 DEBUG nova.objects.instance [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f345aa8-94d2-4213-ab21-fadc362697b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.579 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.579 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.580 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.605 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.606 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Ensure instance console log exists: /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.606 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.607 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.607 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.609 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.612 182759 DEBUG nova.network.neutron [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.613 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.618 182759 WARNING nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.622 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.623 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.626 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.626 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.628 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.628 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.629 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.629 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.629 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.630 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.630 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.630 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.631 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.631 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.631 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.631 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.636 182759 DEBUG nova.objects.instance [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f345aa8-94d2-4213-ab21-fadc362697b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.645 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.645 182759 DEBUG nova.virt.disk.api [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Checking if we can resize image /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.646 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.670 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <uuid>5f345aa8-94d2-4213-ab21-fadc362697b1</uuid>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <name>instance-00000029</name>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersOnMultiNodesTest-server-121266114-1</nova:name>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:51:04</nova:creationTime>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:user uuid="d0c4727b6f6e46339b56a8168cf80a7b">tempest-ServersOnMultiNodesTest-1927863391-project-member</nova:user>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:project uuid="c1e85e2b0f934b719d3ad4076dc719f2">tempest-ServersOnMultiNodesTest-1927863391</nova:project>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="serial">5f345aa8-94d2-4213-ab21-fadc362697b1</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="uuid">5f345aa8-94d2-4213-ab21-fadc362697b1</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.config"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/console.log" append="off"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:51:04 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:51:04 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.726 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.727 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.727 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Using config drive#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.753 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.755 182759 DEBUG nova.virt.disk.api [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Cannot resize image /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.756 182759 DEBUG nova.objects.instance [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'migration_context' on Instance uuid e9e930a4-00ab-4044-bd2d-1f099528cb5d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.770 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.771 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Ensure instance console log exists: /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.772 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.772 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.772 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.774 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.779 182759 WARNING nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.783 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.784 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.787 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.788 182759 DEBUG nova.virt.libvirt.host [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.789 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.790 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.790 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.790 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.791 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.791 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.791 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.791 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.792 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.792 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.792 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.793 182759 DEBUG nova.virt.hardware [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.796 182759 DEBUG nova.objects.instance [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid e9e930a4-00ab-4044-bd2d-1f099528cb5d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.816 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <uuid>e9e930a4-00ab-4044-bd2d-1f099528cb5d</uuid>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <name>instance-0000002a</name>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersOnMultiNodesTest-server-121266114-2</nova:name>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:51:04</nova:creationTime>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:user uuid="d0c4727b6f6e46339b56a8168cf80a7b">tempest-ServersOnMultiNodesTest-1927863391-project-member</nova:user>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:        <nova:project uuid="c1e85e2b0f934b719d3ad4076dc719f2">tempest-ServersOnMultiNodesTest-1927863391</nova:project>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="serial">e9e930a4-00ab-4044-bd2d-1f099528cb5d</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="uuid">e9e930a4-00ab-4044-bd2d-1f099528cb5d</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.config"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/console.log" append="off"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:51:04 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:51:04 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:51:04 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:51:04 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.874 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.876 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.878 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Using config drive#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.930 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Creating config drive at /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.config#033[00m
Jan 21 18:51:04 np0005591285 nova_compute[182755]: 2026-01-21 23:51:04.941 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpieuy7s1x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.078 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpieuy7s1x" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.111 182759 INFO nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Creating config drive at /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.config#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.123 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu7lrhou5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:05 np0005591285 systemd-machined[154022]: New machine qemu-15-instance-00000029.
Jan 21 18:51:05 np0005591285 systemd[1]: Started Virtual Machine qemu-15-instance-00000029.
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.263 182759 DEBUG oslo_concurrency.processutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu7lrhou5" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:05 np0005591285 systemd-machined[154022]: New machine qemu-16-instance-0000002a.
Jan 21 18:51:05 np0005591285 systemd[1]: Started Virtual Machine qemu-16-instance-0000002a.
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.624 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039465.6234567, 5f345aa8-94d2-4213-ab21-fadc362697b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.624 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.628 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.629 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.634 182759 INFO nova.virt.libvirt.driver [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance spawned successfully.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.634 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.657 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.664 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.670 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.670 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.671 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.671 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.671 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.672 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.695 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.696 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039465.6273847, 5f345aa8-94d2-4213-ab21-fadc362697b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.696 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] VM Started (Lifecycle Event)#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.725 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.729 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.761 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.762 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.768 182759 INFO nova.virt.libvirt.driver [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Instance spawned successfully.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.768 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.796 182759 INFO nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Took 1.67 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.797 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.800 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.801 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039465.7585814, e9e930a4-00ab-4044-bd2d-1f099528cb5d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.801 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.811 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.811 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.812 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.812 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.813 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.813 182759 DEBUG nova.virt.libvirt.driver [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.838 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.842 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.886 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.887 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039465.7606409, e9e930a4-00ab-4044-bd2d-1f099528cb5d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.887 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] VM Started (Lifecycle Event)#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.933 182759 INFO nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Took 1.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.934 182759 DEBUG nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.941 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.944 182759 INFO nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Took 2.81 seconds to build instance.#033[00m
Jan 21 18:51:05 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.955 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:05.996 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "5f345aa8-94d2-4213-ab21-fadc362697b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:06.000 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:06.044 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:06.047 182759 INFO nova.compute.manager [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Took 2.86 seconds to build instance.#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:06.064 182759 DEBUG oslo_concurrency.lockutils [None req-57f4fd3b-cc0d-4aaf-ac18-0771eac34ad3 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:06.173 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:06 np0005591285 nova_compute[182755]: 2026-01-21 23:51:06.995 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:09 np0005591285 nova_compute[182755]: 2026-01-21 23:51:09.453 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:11 np0005591285 nova_compute[182755]: 2026-01-21 23:51:11.915 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039456.9131331, 31e5706d-d327-4283-affc-6a3f60b78063 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:11 np0005591285 nova_compute[182755]: 2026-01-21 23:51:11.916 182759 INFO nova.compute.manager [-] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:51:11 np0005591285 nova_compute[182755]: 2026-01-21 23:51:11.959 182759 DEBUG nova.compute.manager [None req-0123bd0d-4863-44ff-82e0-3834da1de9b8 - - - - - -] [instance: 31e5706d-d327-4283-affc-6a3f60b78063] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:12 np0005591285 nova_compute[182755]: 2026-01-21 23:51:12.035 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.532 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.534 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.576 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.709 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.712 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.723 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.724 182759 INFO nova.compute.claims [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.948 182759 DEBUG nova.compute.provider_tree [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.975 182759 DEBUG nova.scheduler.client.report [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:13 np0005591285 nova_compute[182755]: 2026-01-21 23:51:13.998 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.016 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "e07971b1-069f-4494-9b4a-04c296f1e891" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.017 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e07971b1-069f-4494-9b4a-04c296f1e891" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.039 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] No node specified, defaulting to compute-2.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.073 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e07971b1-069f-4494-9b4a-04c296f1e891" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.075 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.140 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.141 182759 DEBUG nova.network.neutron [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.173 182759 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.200 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:51:14 np0005591285 podman[216147]: 2026-01-21 23:51:14.266324697 +0000 UTC m=+0.114354944 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.318 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.321 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.321 182759 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Creating image(s)#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.322 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "/var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.322 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.323 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.337 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.419 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.421 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.421 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.432 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.460 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.540 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.541 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.608 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk 1073741824" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.610 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.611 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.720 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.723 182759 DEBUG nova.virt.disk.api [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Checking if we can resize image /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.724 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.824 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.826 182759 DEBUG nova.virt.disk.api [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Cannot resize image /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.828 182759 DEBUG nova.objects.instance [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'migration_context' on Instance uuid c4b127c8-46d7-4b97-abfe-12c84f0d2070 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.871 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.872 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Ensure instance console log exists: /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.873 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.873 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:14 np0005591285 nova_compute[182755]: 2026-01-21 23:51:14.874 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.023 182759 DEBUG nova.network.neutron [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.024 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.027 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.034 182759 WARNING nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.040 182759 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.041 182759 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.046 182759 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.047 182759 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.050 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.051 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.052 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.052 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.053 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.053 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.054 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.054 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.055 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.056 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.056 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.057 182759 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.064 182759 DEBUG nova.objects.instance [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4b127c8-46d7-4b97-abfe-12c84f0d2070 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.087 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <uuid>c4b127c8-46d7-4b97-abfe-12c84f0d2070</uuid>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <name>instance-0000002c</name>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersOnMultiNodesTest-server-689613698-2</nova:name>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:51:15</nova:creationTime>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:user uuid="d0c4727b6f6e46339b56a8168cf80a7b">tempest-ServersOnMultiNodesTest-1927863391-project-member</nova:user>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:        <nova:project uuid="c1e85e2b0f934b719d3ad4076dc719f2">tempest-ServersOnMultiNodesTest-1927863391</nova:project>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <entry name="serial">c4b127c8-46d7-4b97-abfe-12c84f0d2070</entry>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <entry name="uuid">c4b127c8-46d7-4b97-abfe-12c84f0d2070</entry>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.config"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/console.log" append="off"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:51:15 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:51:15 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:51:15 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:51:15 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.172 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.173 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.173 182759 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Using config drive#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.412 182759 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Creating config drive at /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.config#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.424 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3k0161yb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:15 np0005591285 nova_compute[182755]: 2026-01-21 23:51:15.572 182759 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3k0161yb" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:15 np0005591285 systemd-machined[154022]: New machine qemu-17-instance-0000002c.
Jan 21 18:51:15 np0005591285 systemd[1]: Started Virtual Machine qemu-17-instance-0000002c.
Jan 21 18:51:15 np0005591285 podman[216191]: 2026-01-21 23:51:15.771518439 +0000 UTC m=+0.096202627 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git)
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.277 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039476.2773485, c4b127c8-46d7-4b97-abfe-12c84f0d2070 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.278 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.281 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.281 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.291 182759 INFO nova.virt.libvirt.driver [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Instance spawned successfully.#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.292 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.320 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.326 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.337 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.337 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.338 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.338 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.339 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.340 182759 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.381 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.382 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039476.2784667, c4b127c8-46d7-4b97-abfe-12c84f0d2070 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.382 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] VM Started (Lifecycle Event)#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.415 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.421 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.459 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.463 182759 INFO nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Took 2.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.463 182759 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.577 182759 INFO nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Took 2.92 seconds to build instance.#033[00m
Jan 21 18:51:16 np0005591285 nova_compute[182755]: 2026-01-21 23:51:16.619 182759 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:17 np0005591285 nova_compute[182755]: 2026-01-21 23:51:17.040 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:19 np0005591285 nova_compute[182755]: 2026-01-21 23:51:19.458 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.267 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.267 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.268 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.268 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.269 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.281 182759 INFO nova.compute.manager [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Terminating instance#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.294 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "refresh_cache-c4b127c8-46d7-4b97-abfe-12c84f0d2070" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.295 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquired lock "refresh_cache-c4b127c8-46d7-4b97-abfe-12c84f0d2070" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.295 182759 DEBUG nova.network.neutron [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.519 182759 DEBUG nova.network.neutron [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.882 182759 DEBUG nova.network.neutron [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.903 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Releasing lock "refresh_cache-c4b127c8-46d7-4b97-abfe-12c84f0d2070" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:51:20 np0005591285 nova_compute[182755]: 2026-01-21 23:51:20.904 182759 DEBUG nova.compute.manager [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:51:20 np0005591285 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Jan 21 18:51:20 np0005591285 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002c.scope: Consumed 5.162s CPU time.
Jan 21 18:51:20 np0005591285 systemd-machined[154022]: Machine qemu-17-instance-0000002c terminated.
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.177 182759 INFO nova.virt.libvirt.driver [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Instance destroyed successfully.#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.178 182759 DEBUG nova.objects.instance [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'resources' on Instance uuid c4b127c8-46d7-4b97-abfe-12c84f0d2070 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.202 182759 INFO nova.virt.libvirt.driver [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Deleting instance files /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070_del#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.203 182759 INFO nova.virt.libvirt.driver [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Deletion of /var/lib/nova/instances/c4b127c8-46d7-4b97-abfe-12c84f0d2070_del complete#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.267 182759 INFO nova.compute.manager [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.267 182759 DEBUG oslo.service.loopingcall [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.268 182759 DEBUG nova.compute.manager [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.268 182759 DEBUG nova.network.neutron [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.828 182759 DEBUG nova.network.neutron [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.843 182759 DEBUG nova.network.neutron [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.869 182759 INFO nova.compute.manager [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Took 0.60 seconds to deallocate network for instance.#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.970 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:21 np0005591285 nova_compute[182755]: 2026-01-21 23:51:21.970 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.042 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.071 182759 DEBUG nova.compute.provider_tree [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.098 182759 DEBUG nova.scheduler.client.report [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.131 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.158 182759 INFO nova.scheduler.client.report [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Deleted allocations for instance c4b127c8-46d7-4b97-abfe-12c84f0d2070#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.280 182759 DEBUG oslo_concurrency.lockutils [None req-50d45197-300d-431d-989c-00f5ef6f41af d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "c4b127c8-46d7-4b97-abfe-12c84f0d2070" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.435 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-5f345aa8-94d2-4213-ab21-fadc362697b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.436 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-5f345aa8-94d2-4213-ab21-fadc362697b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.436 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.436 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5f345aa8-94d2-4213-ab21-fadc362697b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:22 np0005591285 nova_compute[182755]: 2026-01-21 23:51:22.744 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:23 np0005591285 podman[216261]: 2026-01-21 23:51:23.238474114 +0000 UTC m=+0.087329750 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:51:23 np0005591285 nova_compute[182755]: 2026-01-21 23:51:23.397 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:23 np0005591285 nova_compute[182755]: 2026-01-21 23:51:23.414 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-5f345aa8-94d2-4213-ab21-fadc362697b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:51:23 np0005591285 nova_compute[182755]: 2026-01-21 23:51:23.415 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.246 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.247 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.461 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.876 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "5f345aa8-94d2-4213-ab21-fadc362697b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.877 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "5f345aa8-94d2-4213-ab21-fadc362697b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.877 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "5f345aa8-94d2-4213-ab21-fadc362697b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.878 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "5f345aa8-94d2-4213-ab21-fadc362697b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.879 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "5f345aa8-94d2-4213-ab21-fadc362697b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.896 182759 INFO nova.compute.manager [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Terminating instance#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.908 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "refresh_cache-5f345aa8-94d2-4213-ab21-fadc362697b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.909 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquired lock "refresh_cache-5f345aa8-94d2-4213-ab21-fadc362697b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:51:24 np0005591285 nova_compute[182755]: 2026-01-21 23:51:24.909 182759 DEBUG nova.network.neutron [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.102 182759 DEBUG nova.network.neutron [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.117 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.118 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.118 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.118 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.119 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.131 182759 INFO nova.compute.manager [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Terminating instance#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.144 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "refresh_cache-e9e930a4-00ab-4044-bd2d-1f099528cb5d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.145 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquired lock "refresh_cache-e9e930a4-00ab-4044-bd2d-1f099528cb5d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.145 182759 DEBUG nova.network.neutron [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.459 182759 DEBUG nova.network.neutron [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.506 182759 DEBUG nova.network.neutron [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.530 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Releasing lock "refresh_cache-5f345aa8-94d2-4213-ab21-fadc362697b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.531 182759 DEBUG nova.compute.manager [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:51:25 np0005591285 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000029.scope: Deactivated successfully.
Jan 21 18:51:25 np0005591285 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000029.scope: Consumed 12.885s CPU time.
Jan 21 18:51:25 np0005591285 systemd-machined[154022]: Machine qemu-15-instance-00000029 terminated.
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.805 182759 INFO nova.virt.libvirt.driver [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance destroyed successfully.#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.806 182759 DEBUG nova.objects.instance [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'resources' on Instance uuid 5f345aa8-94d2-4213-ab21-fadc362697b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.835 182759 INFO nova.virt.libvirt.driver [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Deleting instance files /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1_del#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.836 182759 INFO nova.virt.libvirt.driver [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Deletion of /var/lib/nova/instances/5f345aa8-94d2-4213-ab21-fadc362697b1_del complete#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.877 182759 DEBUG nova.network.neutron [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.897 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Releasing lock "refresh_cache-e9e930a4-00ab-4044-bd2d-1f099528cb5d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.898 182759 DEBUG nova.compute.manager [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.924 182759 INFO nova.compute.manager [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.925 182759 DEBUG oslo.service.loopingcall [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.925 182759 DEBUG nova.compute.manager [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:51:25 np0005591285 nova_compute[182755]: 2026-01-21 23:51:25.925 182759 DEBUG nova.network.neutron [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:51:25 np0005591285 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 21 18:51:25 np0005591285 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002a.scope: Consumed 13.418s CPU time.
Jan 21 18:51:25 np0005591285 systemd-machined[154022]: Machine qemu-16-instance-0000002a terminated.
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.111 182759 DEBUG nova.network.neutron [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.132 182759 DEBUG nova.network.neutron [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.148 182759 INFO nova.compute.manager [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Took 0.22 seconds to deallocate network for instance.#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.173 182759 INFO nova.virt.libvirt.driver [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Instance destroyed successfully.#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.174 182759 DEBUG nova.objects.instance [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'resources' on Instance uuid e9e930a4-00ab-4044-bd2d-1f099528cb5d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.199 182759 INFO nova.virt.libvirt.driver [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Deleting instance files /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d_del#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.200 182759 INFO nova.virt.libvirt.driver [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Deletion of /var/lib/nova/instances/e9e930a4-00ab-4044-bd2d-1f099528cb5d_del complete#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.290 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.291 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.315 182759 INFO nova.compute.manager [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.315 182759 DEBUG oslo.service.loopingcall [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.316 182759 DEBUG nova.compute.manager [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.316 182759 DEBUG nova.network.neutron [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.407 182759 DEBUG nova.compute.provider_tree [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.427 182759 DEBUG nova.scheduler.client.report [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.469 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.510 182759 INFO nova.scheduler.client.report [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Deleted allocations for instance 5f345aa8-94d2-4213-ab21-fadc362697b1#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.610 182759 DEBUG nova.network.neutron [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.624 182759 DEBUG oslo_concurrency.lockutils [None req-6204eec7-2fd1-4c08-9e33-9eefe34a69fe d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "5f345aa8-94d2-4213-ab21-fadc362697b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.654 182759 DEBUG nova.network.neutron [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.685 182759 INFO nova.compute.manager [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Took 0.37 seconds to deallocate network for instance.#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.829 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.830 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.905 182759 DEBUG nova.compute.provider_tree [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.927 182759 DEBUG nova.scheduler.client.report [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:26 np0005591285 nova_compute[182755]: 2026-01-21 23:51:26.970 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.007 182759 INFO nova.scheduler.client.report [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Deleted allocations for instance e9e930a4-00ab-4044-bd2d-1f099528cb5d#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.047 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.128 182759 DEBUG oslo_concurrency.lockutils [None req-6e5029f7-1c3b-41d3-a79e-5f72b37f46e1 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e9e930a4-00ab-4044-bd2d-1f099528cb5d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.251 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.252 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.253 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:51:27 np0005591285 podman[216302]: 2026-01-21 23:51:27.326498192 +0000 UTC m=+0.178727528 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.519 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.522 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5666MB free_disk=73.37655639648438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.522 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.523 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.582 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.583 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.606 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.627 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.650 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:51:27 np0005591285 nova_compute[182755]: 2026-01-21 23:51:27.650 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:29 np0005591285 nova_compute[182755]: 2026-01-21 23:51:29.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:30 np0005591285 podman[216329]: 2026-01-21 23:51:30.239571502 +0000 UTC m=+0.105994980 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 18:51:30 np0005591285 podman[216330]: 2026-01-21 23:51:30.249630742 +0000 UTC m=+0.110255625 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 18:51:30 np0005591285 nova_compute[182755]: 2026-01-21 23:51:30.652 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:51:32 np0005591285 nova_compute[182755]: 2026-01-21 23:51:32.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:34 np0005591285 nova_compute[182755]: 2026-01-21 23:51:34.489 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:36 np0005591285 nova_compute[182755]: 2026-01-21 23:51:36.175 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039481.172932, c4b127c8-46d7-4b97-abfe-12c84f0d2070 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:36 np0005591285 nova_compute[182755]: 2026-01-21 23:51:36.176 182759 INFO nova.compute.manager [-] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:51:36 np0005591285 nova_compute[182755]: 2026-01-21 23:51:36.208 182759 DEBUG nova.compute.manager [None req-0171a29a-e664-49fa-ba59-6bbb469a89e2 - - - - - -] [instance: c4b127c8-46d7-4b97-abfe-12c84f0d2070] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:37 np0005591285 nova_compute[182755]: 2026-01-21 23:51:37.056 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:39 np0005591285 nova_compute[182755]: 2026-01-21 23:51:39.490 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:40 np0005591285 nova_compute[182755]: 2026-01-21 23:51:40.804 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039485.8026242, 5f345aa8-94d2-4213-ab21-fadc362697b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:40 np0005591285 nova_compute[182755]: 2026-01-21 23:51:40.805 182759 INFO nova.compute.manager [-] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:51:40 np0005591285 nova_compute[182755]: 2026-01-21 23:51:40.824 182759 DEBUG nova.compute.manager [None req-9cbd0bc3-0fb1-4b0a-9e45-12b96814872b - - - - - -] [instance: 5f345aa8-94d2-4213-ab21-fadc362697b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:41 np0005591285 nova_compute[182755]: 2026-01-21 23:51:41.171 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039486.1690207, e9e930a4-00ab-4044-bd2d-1f099528cb5d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:41 np0005591285 nova_compute[182755]: 2026-01-21 23:51:41.172 182759 INFO nova.compute.manager [-] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:51:41 np0005591285 nova_compute[182755]: 2026-01-21 23:51:41.201 182759 DEBUG nova.compute.manager [None req-fac4934c-73fd-48d2-80a6-1f301d4b3308 - - - - - -] [instance: e9e930a4-00ab-4044-bd2d-1f099528cb5d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:42 np0005591285 nova_compute[182755]: 2026-01-21 23:51:42.075 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:44 np0005591285 nova_compute[182755]: 2026-01-21 23:51:44.523 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:45 np0005591285 podman[216372]: 2026-01-21 23:51:45.253591094 +0000 UTC m=+0.117147677 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 18:51:46 np0005591285 podman[216390]: 2026-01-21 23:51:46.255274573 +0000 UTC m=+0.120783576 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:51:47 np0005591285 nova_compute[182755]: 2026-01-21 23:51:47.077 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:51:47.921 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:51:47 np0005591285 nova_compute[182755]: 2026-01-21 23:51:47.922 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:51:47.922 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:51:49 np0005591285 nova_compute[182755]: 2026-01-21 23:51:49.524 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:51:50.924 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.081 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.503 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "094e5be1-805d-4b29-81c0-62d8ecfe353d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.504 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "094e5be1-805d-4b29-81c0-62d8ecfe353d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.536 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.678 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.679 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.687 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.688 182759 INFO nova.compute.claims [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.859 182759 DEBUG nova.compute.provider_tree [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.883 182759 DEBUG nova.scheduler.client.report [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.914 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.915 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.995 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:51:52 np0005591285 nova_compute[182755]: 2026-01-21 23:51:52.995 182759 DEBUG nova.network.neutron [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.030 182759 INFO nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.051 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.159 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.161 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.162 182759 INFO nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Creating image(s)#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.163 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "/var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.164 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "/var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.165 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "/var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.191 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.266 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.268 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.269 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.292 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.381 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.383 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.438 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.440 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.441 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.536 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.537 182759 DEBUG nova.virt.disk.api [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Checking if we can resize image /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.538 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.611 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.612 182759 DEBUG nova.virt.disk.api [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Cannot resize image /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.613 182759 DEBUG nova.objects.instance [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lazy-loading 'migration_context' on Instance uuid 094e5be1-805d-4b29-81c0-62d8ecfe353d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.627 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.628 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Ensure instance console log exists: /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.629 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.629 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:53 np0005591285 nova_compute[182755]: 2026-01-21 23:51:53.629 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.086 182759 DEBUG nova.network.neutron [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.087 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.088 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.094 182759 WARNING nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.099 182759 DEBUG nova.virt.libvirt.host [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.100 182759 DEBUG nova.virt.libvirt.host [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.103 182759 DEBUG nova.virt.libvirt.host [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.104 182759 DEBUG nova.virt.libvirt.host [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.106 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.106 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.107 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.107 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.108 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.108 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.108 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.108 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.109 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.109 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.109 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.110 182759 DEBUG nova.virt.hardware [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.115 182759 DEBUG nova.objects.instance [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lazy-loading 'pci_devices' on Instance uuid 094e5be1-805d-4b29-81c0-62d8ecfe353d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.131 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <uuid>094e5be1-805d-4b29-81c0-62d8ecfe353d</uuid>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <name>instance-0000002e</name>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:name>tempest-TenantUsagesTestJSON-server-1973191792</nova:name>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:51:54</nova:creationTime>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:user uuid="32616a8bfc24415297b9c4783dbc977d">tempest-TenantUsagesTestJSON-1604812101-project-member</nova:user>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:        <nova:project uuid="1d77eaad9237406bac52794626a266ee">tempest-TenantUsagesTestJSON-1604812101</nova:project>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <nova:ports/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <entry name="serial">094e5be1-805d-4b29-81c0-62d8ecfe353d</entry>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <entry name="uuid">094e5be1-805d-4b29-81c0-62d8ecfe353d</entry>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.config"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/console.log" append="off"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:51:54 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:51:54 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:51:54 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:51:54 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:51:54 np0005591285 podman[216427]: 2026-01-21 23:51:54.208410887 +0000 UTC m=+0.075981705 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.212 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.213 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.214 182759 INFO nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Using config drive#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.425 182759 INFO nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Creating config drive at /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.config#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.430 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp4nmayjw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.527 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:54 np0005591285 nova_compute[182755]: 2026-01-21 23:51:54.580 182759 DEBUG oslo_concurrency.processutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp4nmayjw" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:51:54 np0005591285 systemd-machined[154022]: New machine qemu-18-instance-0000002e.
Jan 21 18:51:54 np0005591285 systemd[1]: Started Virtual Machine qemu-18-instance-0000002e.
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.237 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039515.2360191, 094e5be1-805d-4b29-81c0-62d8ecfe353d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.239 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.245 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.246 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.252 182759 INFO nova.virt.libvirt.driver [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Instance spawned successfully.#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.253 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.288 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.292 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.300 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.301 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.301 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.301 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.302 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.302 182759 DEBUG nova.virt.libvirt.driver [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.331 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.332 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039515.2382462, 094e5be1-805d-4b29-81c0-62d8ecfe353d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.332 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] VM Started (Lifecycle Event)#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.383 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.388 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.420 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.463 182759 INFO nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Took 2.30 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.464 182759 DEBUG nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.576 182759 INFO nova.compute.manager [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Took 2.95 seconds to build instance.#033[00m
Jan 21 18:51:55 np0005591285 nova_compute[182755]: 2026-01-21 23:51:55.603 182759 DEBUG oslo_concurrency.lockutils [None req-29096907-7086-45f6-8ffc-b0bcc4f3d5bc 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "094e5be1-805d-4b29-81c0-62d8ecfe353d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:57 np0005591285 nova_compute[182755]: 2026-01-21 23:51:57.085 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.057 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "094e5be1-805d-4b29-81c0-62d8ecfe353d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.059 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "094e5be1-805d-4b29-81c0-62d8ecfe353d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.059 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "094e5be1-805d-4b29-81c0-62d8ecfe353d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.060 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "094e5be1-805d-4b29-81c0-62d8ecfe353d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.060 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "094e5be1-805d-4b29-81c0-62d8ecfe353d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.083 182759 INFO nova.compute.manager [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Terminating instance#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.102 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "refresh_cache-094e5be1-805d-4b29-81c0-62d8ecfe353d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.103 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquired lock "refresh_cache-094e5be1-805d-4b29-81c0-62d8ecfe353d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.103 182759 DEBUG nova.network.neutron [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:51:58 np0005591285 podman[216478]: 2026-01-21 23:51:58.264843699 +0000 UTC m=+0.134872383 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 18:51:58 np0005591285 nova_compute[182755]: 2026-01-21 23:51:58.386 182759 DEBUG nova.network.neutron [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.020 182759 DEBUG nova.network.neutron [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.039 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Releasing lock "refresh_cache-094e5be1-805d-4b29-81c0-62d8ecfe353d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.040 182759 DEBUG nova.compute.manager [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:51:59 np0005591285 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 21 18:51:59 np0005591285 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002e.scope: Consumed 4.346s CPU time.
Jan 21 18:51:59 np0005591285 systemd-machined[154022]: Machine qemu-18-instance-0000002e terminated.
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.308 182759 INFO nova.virt.libvirt.driver [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Instance destroyed successfully.#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.309 182759 DEBUG nova.objects.instance [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lazy-loading 'resources' on Instance uuid 094e5be1-805d-4b29-81c0-62d8ecfe353d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.325 182759 INFO nova.virt.libvirt.driver [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Deleting instance files /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d_del#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.326 182759 INFO nova.virt.libvirt.driver [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Deletion of /var/lib/nova/instances/094e5be1-805d-4b29-81c0-62d8ecfe353d_del complete#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.441 182759 INFO nova.compute.manager [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.442 182759 DEBUG oslo.service.loopingcall [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.443 182759 DEBUG nova.compute.manager [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.443 182759 DEBUG nova.network.neutron [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.576 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.648 182759 DEBUG nova.network.neutron [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.668 182759 DEBUG nova.network.neutron [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.690 182759 INFO nova.compute.manager [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Took 0.25 seconds to deallocate network for instance.#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.835 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.836 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.940 182759 DEBUG nova.compute.provider_tree [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:51:59 np0005591285 nova_compute[182755]: 2026-01-21 23:51:59.982 182759 DEBUG nova.scheduler.client.report [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:52:00 np0005591285 nova_compute[182755]: 2026-01-21 23:52:00.017 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:00 np0005591285 nova_compute[182755]: 2026-01-21 23:52:00.056 182759 INFO nova.scheduler.client.report [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Deleted allocations for instance 094e5be1-805d-4b29-81c0-62d8ecfe353d#033[00m
Jan 21 18:52:00 np0005591285 nova_compute[182755]: 2026-01-21 23:52:00.138 182759 DEBUG oslo_concurrency.lockutils [None req-3f328647-004c-4e58-bae4-58f94a15f71b 32616a8bfc24415297b9c4783dbc977d 1d77eaad9237406bac52794626a266ee - - default default] Lock "094e5be1-805d-4b29-81c0-62d8ecfe353d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:01 np0005591285 podman[216514]: 2026-01-21 23:52:01.193596618 +0000 UTC m=+0.054143571 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 18:52:01 np0005591285 podman[216513]: 2026-01-21 23:52:01.221006502 +0000 UTC m=+0.085385087 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:52:02 np0005591285 nova_compute[182755]: 2026-01-21 23:52:02.089 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:02.957 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:02.958 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:02.958 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:04 np0005591285 nova_compute[182755]: 2026-01-21 23:52:04.578 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:07 np0005591285 nova_compute[182755]: 2026-01-21 23:52:07.092 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:09 np0005591285 nova_compute[182755]: 2026-01-21 23:52:09.580 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:12 np0005591285 nova_compute[182755]: 2026-01-21 23:52:12.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:14 np0005591285 nova_compute[182755]: 2026-01-21 23:52:14.305 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039519.303811, 094e5be1-805d-4b29-81c0-62d8ecfe353d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:52:14 np0005591285 nova_compute[182755]: 2026-01-21 23:52:14.306 182759 INFO nova.compute.manager [-] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:52:14 np0005591285 nova_compute[182755]: 2026-01-21 23:52:14.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:15 np0005591285 nova_compute[182755]: 2026-01-21 23:52:15.447 182759 DEBUG nova.compute.manager [None req-fdbb484d-cbc3-4455-8933-3649fb005e96 - - - - - -] [instance: 094e5be1-805d-4b29-81c0-62d8ecfe353d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:52:16 np0005591285 podman[216558]: 2026-01-21 23:52:16.23743442 +0000 UTC m=+0.095640484 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:52:17 np0005591285 nova_compute[182755]: 2026-01-21 23:52:17.100 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:17 np0005591285 podman[216578]: 2026-01-21 23:52:17.243573677 +0000 UTC m=+0.107843670 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git)
Jan 21 18:52:17 np0005591285 nova_compute[182755]: 2026-01-21 23:52:17.373 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:17 np0005591285 nova_compute[182755]: 2026-01-21 23:52:17.374 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:17 np0005591285 nova_compute[182755]: 2026-01-21 23:52:17.567 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:52:19 np0005591285 nova_compute[182755]: 2026-01-21 23:52:19.585 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.598 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.599 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.609 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.610 182759 INFO nova.compute.claims [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.836 182759 DEBUG nova.compute.provider_tree [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.853 182759 DEBUG nova.scheduler.client.report [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.893 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.894 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.980 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:52:20 np0005591285 nova_compute[182755]: 2026-01-21 23:52:20.981 182759 DEBUG nova.network.neutron [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.014 182759 INFO nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.059 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.266 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.269 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.269 182759 INFO nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Creating image(s)#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.271 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "/var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.271 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "/var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.273 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "/var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.301 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.395 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.398 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.399 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.427 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.517 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.519 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.574 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.576 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.576 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.676 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.678 182759 DEBUG nova.virt.disk.api [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Checking if we can resize image /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.679 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.757 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.758 182759 DEBUG nova.virt.disk.api [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Cannot resize image /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.759 182759 DEBUG nova.objects.instance [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lazy-loading 'migration_context' on Instance uuid f3e5045f-b39a-435f-9112-c2adfb8c8b71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:52:21 np0005591285 nova_compute[182755]: 2026-01-21 23:52:21.953 182759 DEBUG nova.policy [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70ab01acb8e9483f8de4b65f638c134d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '746e12e350104caf8fd0201b7e30f2fe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:52:22 np0005591285 nova_compute[182755]: 2026-01-21 23:52:22.144 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:22 np0005591285 nova_compute[182755]: 2026-01-21 23:52:22.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:23 np0005591285 nova_compute[182755]: 2026-01-21 23:52:23.146 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:52:23 np0005591285 nova_compute[182755]: 2026-01-21 23:52:23.147 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Ensure instance console log exists: /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:52:23 np0005591285 nova_compute[182755]: 2026-01-21 23:52:23.148 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:23 np0005591285 nova_compute[182755]: 2026-01-21 23:52:23.148 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:23 np0005591285 nova_compute[182755]: 2026-01-21 23:52:23.149 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.149 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:52:23.154 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 18:52:24 np0005591285 nova_compute[182755]: 2026-01-21 23:52:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:24 np0005591285 nova_compute[182755]: 2026-01-21 23:52:24.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:52:24 np0005591285 nova_compute[182755]: 2026-01-21 23:52:24.587 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:25 np0005591285 podman[216614]: 2026-01-21 23:52:25.192249333 +0000 UTC m=+0.067535370 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.696 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.697 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.697 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.698 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.698 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.729 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.729 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.729 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.755 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:25 np0005591285 nova_compute[182755]: 2026-01-21 23:52:25.943 182759 DEBUG nova.network.neutron [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Successfully created port: b6683742-2d0d-48b9-8fcd-c835a90a3423 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:52:26 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:26Z|00099|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.148 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.312 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.313 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.979 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.980 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.980 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:27 np0005591285 nova_compute[182755]: 2026-01-21 23:52:27.981 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.240 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.241 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.37643432617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.622 182759 DEBUG nova.network.neutron [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Successfully updated port: b6683742-2d0d-48b9-8fcd-c835a90a3423 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.687 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance f3e5045f-b39a-435f-9112-c2adfb8c8b71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.688 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.688 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:52:28 np0005591285 nova_compute[182755]: 2026-01-21 23:52:28.901 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:52:29 np0005591285 podman[216639]: 2026-01-21 23:52:29.227533489 +0000 UTC m=+0.097805301 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 18:52:29 np0005591285 nova_compute[182755]: 2026-01-21 23:52:29.588 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:29 np0005591285 nova_compute[182755]: 2026-01-21 23:52:29.955 182759 DEBUG nova.compute.manager [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-changed-b6683742-2d0d-48b9-8fcd-c835a90a3423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:52:29 np0005591285 nova_compute[182755]: 2026-01-21 23:52:29.956 182759 DEBUG nova.compute.manager [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Refreshing instance network info cache due to event network-changed-b6683742-2d0d-48b9-8fcd-c835a90a3423. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:52:29 np0005591285 nova_compute[182755]: 2026-01-21 23:52:29.956 182759 DEBUG oslo_concurrency.lockutils [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f3e5045f-b39a-435f-9112-c2adfb8c8b71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:52:29 np0005591285 nova_compute[182755]: 2026-01-21 23:52:29.956 182759 DEBUG oslo_concurrency.lockutils [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f3e5045f-b39a-435f-9112-c2adfb8c8b71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:52:29 np0005591285 nova_compute[182755]: 2026-01-21 23:52:29.956 182759 DEBUG nova.network.neutron [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Refreshing network info cache for port b6683742-2d0d-48b9-8fcd-c835a90a3423 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:52:30 np0005591285 nova_compute[182755]: 2026-01-21 23:52:30.117 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "refresh_cache-f3e5045f-b39a-435f-9112-c2adfb8c8b71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:52:30 np0005591285 nova_compute[182755]: 2026-01-21 23:52:30.187 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:52:30 np0005591285 nova_compute[182755]: 2026-01-21 23:52:30.257 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:52:30 np0005591285 nova_compute[182755]: 2026-01-21 23:52:30.257 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:30 np0005591285 nova_compute[182755]: 2026-01-21 23:52:30.923 182759 DEBUG nova.network.neutron [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:52:31 np0005591285 nova_compute[182755]: 2026-01-21 23:52:31.409 182759 DEBUG nova.network.neutron [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:52:32 np0005591285 nova_compute[182755]: 2026-01-21 23:52:32.182 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:32 np0005591285 nova_compute[182755]: 2026-01-21 23:52:32.184 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:32 np0005591285 nova_compute[182755]: 2026-01-21 23:52:32.185 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:52:32 np0005591285 podman[216665]: 2026-01-21 23:52:32.247664516 +0000 UTC m=+0.103373680 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 18:52:32 np0005591285 podman[216666]: 2026-01-21 23:52:32.256652237 +0000 UTC m=+0.111506918 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:52:32 np0005591285 nova_compute[182755]: 2026-01-21 23:52:32.715 182759 DEBUG oslo_concurrency.lockutils [req-014fc005-167a-46cd-9dbe-2b93b9455ef1 req-51ca09c9-ef28-4f50-aca7-2578c2a61bc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f3e5045f-b39a-435f-9112-c2adfb8c8b71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:52:32 np0005591285 nova_compute[182755]: 2026-01-21 23:52:32.717 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquired lock "refresh_cache-f3e5045f-b39a-435f-9112-c2adfb8c8b71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:52:32 np0005591285 nova_compute[182755]: 2026-01-21 23:52:32.717 182759 DEBUG nova.network.neutron [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:52:33 np0005591285 nova_compute[182755]: 2026-01-21 23:52:33.116 182759 DEBUG nova.network.neutron [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.469 182759 DEBUG nova.network.neutron [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Updating instance_info_cache with network_info: [{"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.510 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Releasing lock "refresh_cache-f3e5045f-b39a-435f-9112-c2adfb8c8b71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.510 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Instance network_info: |[{"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.515 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Start _get_guest_xml network_info=[{"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.522 182759 WARNING nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.527 182759 DEBUG nova.virt.libvirt.host [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.528 182759 DEBUG nova.virt.libvirt.host [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.532 182759 DEBUG nova.virt.libvirt.host [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.533 182759 DEBUG nova.virt.libvirt.host [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.535 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.536 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.537 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.537 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.538 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.538 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.539 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.539 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.540 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.540 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.541 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.541 182759 DEBUG nova.virt.hardware [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.548 182759 DEBUG nova.virt.libvirt.vif [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:52:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1741589388',display_name='tempest-ImagesNegativeTestJSON-server-1741589388',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1741589388',id=48,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='746e12e350104caf8fd0201b7e30f2fe',ramdisk_id='',reservation_id='r-jduu6rny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1247847828',owner_user_name='tempest-ImagesNegativeT
estJSON-1247847828-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:21Z,user_data=None,user_id='70ab01acb8e9483f8de4b65f638c134d',uuid=f3e5045f-b39a-435f-9112-c2adfb8c8b71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.549 182759 DEBUG nova.network.os_vif_util [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Converting VIF {"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.551 182759 DEBUG nova.network.os_vif_util [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.553 182759 DEBUG nova.objects.instance [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lazy-loading 'pci_devices' on Instance uuid f3e5045f-b39a-435f-9112-c2adfb8c8b71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.592 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <uuid>f3e5045f-b39a-435f-9112-c2adfb8c8b71</uuid>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <name>instance-00000030</name>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:name>tempest-ImagesNegativeTestJSON-server-1741589388</nova:name>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:52:34</nova:creationTime>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:user uuid="70ab01acb8e9483f8de4b65f638c134d">tempest-ImagesNegativeTestJSON-1247847828-project-member</nova:user>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:project uuid="746e12e350104caf8fd0201b7e30f2fe">tempest-ImagesNegativeTestJSON-1247847828</nova:project>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        <nova:port uuid="b6683742-2d0d-48b9-8fcd-c835a90a3423">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <entry name="serial">f3e5045f-b39a-435f-9112-c2adfb8c8b71</entry>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <entry name="uuid">f3e5045f-b39a-435f-9112-c2adfb8c8b71</entry>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.config"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:97:6d:a4"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <target dev="tapb6683742-2d"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/console.log" append="off"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:52:34 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:52:34 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:52:34 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:52:34 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.594 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Preparing to wait for external event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.595 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.596 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.596 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.597 182759 DEBUG nova.virt.libvirt.vif [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:52:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1741589388',display_name='tempest-ImagesNegativeTestJSON-server-1741589388',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1741589388',id=48,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='746e12e350104caf8fd0201b7e30f2fe',ramdisk_id='',reservation_id='r-jduu6rny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-1247847828',owner_user_name='tempest-ImagesNegativeTestJSON-1247847828-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:21Z,user_data=None,user_id='70ab01acb8e9483f8de4b65f638c134d',uuid=f3e5045f-b39a-435f-9112-c2adfb8c8b71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.598 182759 DEBUG nova.network.os_vif_util [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Converting VIF {"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.599 182759 DEBUG nova.network.os_vif_util [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.600 182759 DEBUG os_vif [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.604 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.604 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.605 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.616 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6683742-2d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.617 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb6683742-2d, col_values=(('external_ids', {'iface-id': 'b6683742-2d0d-48b9-8fcd-c835a90a3423', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:6d:a4', 'vm-uuid': 'f3e5045f-b39a-435f-9112-c2adfb8c8b71'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:34 np0005591285 NetworkManager[55017]: <info>  [1769039554.6210] manager: (tapb6683742-2d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.623 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.629 182759 INFO os_vif [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d')#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.702 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.702 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.703 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] No VIF found with MAC fa:16:3e:97:6d:a4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:52:34 np0005591285 nova_compute[182755]: 2026-01-21 23:52:34.703 182759 INFO nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Using config drive#033[00m
Jan 21 18:52:35 np0005591285 nova_compute[182755]: 2026-01-21 23:52:35.708 182759 INFO nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Creating config drive at /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.config#033[00m
Jan 21 18:52:35 np0005591285 nova_compute[182755]: 2026-01-21 23:52:35.719 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbaxaepoh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:52:35 np0005591285 nova_compute[182755]: 2026-01-21 23:52:35.854 182759 DEBUG oslo_concurrency.processutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbaxaepoh" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:52:35 np0005591285 kernel: tapb6683742-2d: entered promiscuous mode
Jan 21 18:52:35 np0005591285 NetworkManager[55017]: <info>  [1769039555.9433] manager: (tapb6683742-2d): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Jan 21 18:52:35 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:35Z|00100|binding|INFO|Claiming lport b6683742-2d0d-48b9-8fcd-c835a90a3423 for this chassis.
Jan 21 18:52:35 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:35Z|00101|binding|INFO|b6683742-2d0d-48b9-8fcd-c835a90a3423: Claiming fa:16:3e:97:6d:a4 10.100.0.3
Jan 21 18:52:35 np0005591285 nova_compute[182755]: 2026-01-21 23:52:35.944 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:35 np0005591285 nova_compute[182755]: 2026-01-21 23:52:35.950 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:35.964 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:6d:a4 10.100.0.3'], port_security=['fa:16:3e:97:6d:a4 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f3e5045f-b39a-435f-9112-c2adfb8c8b71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746e12e350104caf8fd0201b7e30f2fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '32ec70e7-f388-434b-bf72-519901c7c219', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c45b543f-1374-4935-8ddf-60444b61519b, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b6683742-2d0d-48b9-8fcd-c835a90a3423) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:52:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:35.966 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b6683742-2d0d-48b9-8fcd-c835a90a3423 in datapath ac9d78c1-4947-4d13-afef-98be9fffb4a3 bound to our chassis#033[00m
Jan 21 18:52:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:35.968 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac9d78c1-4947-4d13-afef-98be9fffb4a3#033[00m
Jan 21 18:52:35 np0005591285 systemd-udevd[216727]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:52:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:35.995 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a51b5b47-ecf7-4906-b1b6-627796fbdbeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:35.996 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapac9d78c1-41 in ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.000 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapac9d78c1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.001 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c1919aeb-1278-4dbc-8267-c85043dff3d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.002 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[20373a90-6a44-48a6-920f-5f7791ebd67b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 NetworkManager[55017]: <info>  [1769039556.0154] device (tapb6683742-2d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:52:36 np0005591285 NetworkManager[55017]: <info>  [1769039556.0163] device (tapb6683742-2d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:52:36 np0005591285 systemd-machined[154022]: New machine qemu-19-instance-00000030.
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.024 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb0c2d0-1690-454e-a51c-6e5158bed818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.028 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:36Z|00102|binding|INFO|Setting lport b6683742-2d0d-48b9-8fcd-c835a90a3423 ovn-installed in OVS
Jan 21 18:52:36 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:36Z|00103|binding|INFO|Setting lport b6683742-2d0d-48b9-8fcd-c835a90a3423 up in Southbound
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.033 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 systemd[1]: Started Virtual Machine qemu-19-instance-00000030.
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.051 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1f90cc69-2244-4997-8279-b1c7c3a87735]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.097 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[abe8d744-f241-4bb9-9696-cdab2380f0cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.105 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c166c0-041a-40e7-9773-6f5e85c406b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 NetworkManager[55017]: <info>  [1769039556.1067] manager: (tapac9d78c1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Jan 21 18:52:36 np0005591285 systemd-udevd[216733]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.153 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[369567dd-b29f-4f2d-b0a4-11b8ed0ded20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.160 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[31ebb1d9-96f9-49bd-9cde-bf87853f0b47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 NetworkManager[55017]: <info>  [1769039556.1982] device (tapac9d78c1-40): carrier: link connected
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.206 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e538dcf9-197f-46c0-8d7d-84e15e910f37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.235 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[907ffcc7-95ba-4314-bfd0-ed586f3f605e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac9d78c1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:75:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411485, 'reachable_time': 41319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216762, 'error': None, 'target': 'ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.253 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a0496676-a835-40a9-92d8-d21c1f0d3cd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:7526'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411485, 'tstamp': 411485}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216763, 'error': None, 'target': 'ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.281 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f850877c-57a2-426f-b8d2-b83ea0472511]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac9d78c1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:75:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411485, 'reachable_time': 41319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216764, 'error': None, 'target': 'ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.322 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5c277e9b-55aa-46d1-944f-a2d08d56fd98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.414 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cf7fc5-b669-4244-9625-49f1d47783cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.416 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac9d78c1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.417 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.418 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac9d78c1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:36 np0005591285 NetworkManager[55017]: <info>  [1769039556.4211] manager: (tapac9d78c1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 kernel: tapac9d78c1-40: entered promiscuous mode
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.425 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac9d78c1-40, col_values=(('external_ids', {'iface-id': 'c80abfd5-5bae-41b1-a02f-2cf23be5b451'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.427 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:36Z|00104|binding|INFO|Releasing lport c80abfd5-5bae-41b1-a02f-2cf23be5b451 from this chassis (sb_readonly=0)
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.428 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.429 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac9d78c1-4947-4d13-afef-98be9fffb4a3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac9d78c1-4947-4d13-afef-98be9fffb4a3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.430 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0509e76-fba4-4eaa-a7ce-5ed9ce247d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.431 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-ac9d78c1-4947-4d13-afef-98be9fffb4a3
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/ac9d78c1-4947-4d13-afef-98be9fffb4a3.pid.haproxy
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID ac9d78c1-4947-4d13-afef-98be9fffb4a3
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:52:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:36.432 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'env', 'PROCESS_TAG=haproxy-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ac9d78c1-4947-4d13-afef-98be9fffb4a3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.469 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.596 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039556.595621, f3e5045f-b39a-435f-9112-c2adfb8c8b71 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.597 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] VM Started (Lifecycle Event)#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.627 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.633 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039556.5958037, f3e5045f-b39a-435f-9112-c2adfb8c8b71 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.634 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.653 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.658 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.714 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:52:36 np0005591285 podman[216803]: 2026-01-21 23:52:36.908464523 +0000 UTC m=+0.055663141 container create 7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:52:36 np0005591285 systemd[1]: Started libpod-conmon-7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3.scope.
Jan 21 18:52:36 np0005591285 podman[216803]: 2026-01-21 23:52:36.879684483 +0000 UTC m=+0.026883101 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.987 182759 DEBUG nova.compute.manager [req-434da1ec-c524-4aff-a18c-e4324b0286e2 req-65e1c84d-54a4-49fa-8351-c853adfa9754 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.988 182759 DEBUG oslo_concurrency.lockutils [req-434da1ec-c524-4aff-a18c-e4324b0286e2 req-65e1c84d-54a4-49fa-8351-c853adfa9754 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.988 182759 DEBUG oslo_concurrency.lockutils [req-434da1ec-c524-4aff-a18c-e4324b0286e2 req-65e1c84d-54a4-49fa-8351-c853adfa9754 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.989 182759 DEBUG oslo_concurrency.lockutils [req-434da1ec-c524-4aff-a18c-e4324b0286e2 req-65e1c84d-54a4-49fa-8351-c853adfa9754 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.989 182759 DEBUG nova.compute.manager [req-434da1ec-c524-4aff-a18c-e4324b0286e2 req-65e1c84d-54a4-49fa-8351-c853adfa9754 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Processing event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.990 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.997 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039556.9970303, f3e5045f-b39a-435f-9112-c2adfb8c8b71 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:52:36 np0005591285 nova_compute[182755]: 2026-01-21 23:52:36.998 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.001 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.005 182759 INFO nova.virt.libvirt.driver [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Instance spawned successfully.#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.006 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:52:37 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:52:37 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98d0a0c9299aeafe7b8a13f1bc26b8902a2cfcb66ed58ccffd11be32420ed1b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:52:37 np0005591285 podman[216803]: 2026-01-21 23:52:37.047314303 +0000 UTC m=+0.194512911 container init 7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.047 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.054 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.055 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.055 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.056 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.057 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:52:37 np0005591285 podman[216803]: 2026-01-21 23:52:37.058119092 +0000 UTC m=+0.205317700 container start 7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.059 182759 DEBUG nova.virt.libvirt.driver [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.071 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:52:37 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [NOTICE]   (216822) : New worker (216824) forked
Jan 21 18:52:37 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [NOTICE]   (216822) : Loading success.
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.168 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.402 182759 INFO nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Took 16.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.404 182759 DEBUG nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.518 182759 INFO nova.compute.manager [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Took 16.96 seconds to build instance.#033[00m
Jan 21 18:52:37 np0005591285 nova_compute[182755]: 2026-01-21 23:52:37.542 182759 DEBUG oslo_concurrency.lockutils [None req-d7afeab4-24db-4f00-b9c2-7822a08cbcb2 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:39 np0005591285 nova_compute[182755]: 2026-01-21 23:52:39.595 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:39 np0005591285 nova_compute[182755]: 2026-01-21 23:52:39.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.208 182759 DEBUG nova.compute.manager [req-a7714f2e-d9bc-42c8-bcdd-7b0c88596709 req-4752c755-895b-4fc8-a131-dea6778c3126 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.209 182759 DEBUG oslo_concurrency.lockutils [req-a7714f2e-d9bc-42c8-bcdd-7b0c88596709 req-4752c755-895b-4fc8-a131-dea6778c3126 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.209 182759 DEBUG oslo_concurrency.lockutils [req-a7714f2e-d9bc-42c8-bcdd-7b0c88596709 req-4752c755-895b-4fc8-a131-dea6778c3126 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.210 182759 DEBUG oslo_concurrency.lockutils [req-a7714f2e-d9bc-42c8-bcdd-7b0c88596709 req-4752c755-895b-4fc8-a131-dea6778c3126 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.210 182759 DEBUG nova.compute.manager [req-a7714f2e-d9bc-42c8-bcdd-7b0c88596709 req-4752c755-895b-4fc8-a131-dea6778c3126 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] No waiting events found dispatching network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.210 182759 WARNING nova.compute.manager [req-a7714f2e-d9bc-42c8-bcdd-7b0c88596709 req-4752c755-895b-4fc8-a131-dea6778c3126 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received unexpected event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.728 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.729 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.731 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.731 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.731 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.745 182759 INFO nova.compute.manager [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Terminating instance#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.756 182759 DEBUG nova.compute.manager [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:52:41 np0005591285 kernel: tapb6683742-2d (unregistering): left promiscuous mode
Jan 21 18:52:41 np0005591285 NetworkManager[55017]: <info>  [1769039561.7809] device (tapb6683742-2d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:52:41 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:41Z|00105|binding|INFO|Releasing lport b6683742-2d0d-48b9-8fcd-c835a90a3423 from this chassis (sb_readonly=0)
Jan 21 18:52:41 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:41Z|00106|binding|INFO|Setting lport b6683742-2d0d-48b9-8fcd-c835a90a3423 down in Southbound
Jan 21 18:52:41 np0005591285 ovn_controller[94908]: 2026-01-21T23:52:41Z|00107|binding|INFO|Removing iface tapb6683742-2d ovn-installed in OVS
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.789 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:41.808 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:6d:a4 10.100.0.3'], port_security=['fa:16:3e:97:6d:a4 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f3e5045f-b39a-435f-9112-c2adfb8c8b71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '746e12e350104caf8fd0201b7e30f2fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '32ec70e7-f388-434b-bf72-519901c7c219', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c45b543f-1374-4935-8ddf-60444b61519b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b6683742-2d0d-48b9-8fcd-c835a90a3423) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.810 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:41.812 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b6683742-2d0d-48b9-8fcd-c835a90a3423 in datapath ac9d78c1-4947-4d13-afef-98be9fffb4a3 unbound from our chassis#033[00m
Jan 21 18:52:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:41.816 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac9d78c1-4947-4d13-afef-98be9fffb4a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:52:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:41.818 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dfff48f6-67c8-4b0d-9bbe-52c7cb0e64f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:41.819 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3 namespace which is not needed anymore#033[00m
Jan 21 18:52:41 np0005591285 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000030.scope: Deactivated successfully.
Jan 21 18:52:41 np0005591285 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000030.scope: Consumed 5.406s CPU time.
Jan 21 18:52:41 np0005591285 systemd-machined[154022]: Machine qemu-19-instance-00000030 terminated.
Jan 21 18:52:41 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.989 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:41.998 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [NOTICE]   (216822) : haproxy version is 2.8.14-c23fe91
Jan 21 18:52:42 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [NOTICE]   (216822) : path to executable is /usr/sbin/haproxy
Jan 21 18:52:42 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [WARNING]  (216822) : Exiting Master process...
Jan 21 18:52:42 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [WARNING]  (216822) : Exiting Master process...
Jan 21 18:52:42 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [ALERT]    (216822) : Current worker (216824) exited with code 143 (Terminated)
Jan 21 18:52:42 np0005591285 neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3[216818]: [WARNING]  (216822) : All workers exited. Exiting... (0)
Jan 21 18:52:42 np0005591285 systemd[1]: libpod-7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3.scope: Deactivated successfully.
Jan 21 18:52:42 np0005591285 podman[216857]: 2026-01-21 23:52:42.02881497 +0000 UTC m=+0.076487320 container died 7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.052 182759 INFO nova.virt.libvirt.driver [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Instance destroyed successfully.#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.053 182759 DEBUG nova.objects.instance [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lazy-loading 'resources' on Instance uuid f3e5045f-b39a-435f-9112-c2adfb8c8b71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:52:42 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3-userdata-shm.mount: Deactivated successfully.
Jan 21 18:52:42 np0005591285 systemd[1]: var-lib-containers-storage-overlay-98d0a0c9299aeafe7b8a13f1bc26b8902a2cfcb66ed58ccffd11be32420ed1b3-merged.mount: Deactivated successfully.
Jan 21 18:52:42 np0005591285 podman[216857]: 2026-01-21 23:52:42.096078341 +0000 UTC m=+0.143750641 container cleanup 7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.097 182759 DEBUG nova.virt.libvirt.vif [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:52:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1741589388',display_name='tempest-ImagesNegativeTestJSON-server-1741589388',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1741589388',id=48,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='746e12e350104caf8fd0201b7e30f2fe',ramdisk_id='',reservation_id='r-jduu6rny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-1247847828',owner_user_name='tempest-ImagesNegativeTestJSON-1247847828-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:52:37Z,user_data=None,user_id='70ab01acb8e9483f8de4b65f638c134d',uuid=f3e5045f-b39a-435f-9112-c2adfb8c8b71,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.098 182759 DEBUG nova.network.os_vif_util [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Converting VIF {"id": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "address": "fa:16:3e:97:6d:a4", "network": {"id": "ac9d78c1-4947-4d13-afef-98be9fffb4a3", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1028183010-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "746e12e350104caf8fd0201b7e30f2fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6683742-2d", "ovs_interfaceid": "b6683742-2d0d-48b9-8fcd-c835a90a3423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.100 182759 DEBUG nova.network.os_vif_util [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.101 182759 DEBUG os_vif [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.104 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.105 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6683742-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:42 np0005591285 systemd[1]: libpod-conmon-7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3.scope: Deactivated successfully.
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.141 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.144 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.148 182759 INFO os_vif [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:6d:a4,bridge_name='br-int',has_traffic_filtering=True,id=b6683742-2d0d-48b9-8fcd-c835a90a3423,network=Network(ac9d78c1-4947-4d13-afef-98be9fffb4a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6683742-2d')#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.149 182759 INFO nova.virt.libvirt.driver [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Deleting instance files /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71_del#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.150 182759 INFO nova.virt.libvirt.driver [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Deletion of /var/lib/nova/instances/f3e5045f-b39a-435f-9112-c2adfb8c8b71_del complete#033[00m
Jan 21 18:52:42 np0005591285 podman[216899]: 2026-01-21 23:52:42.191688602 +0000 UTC m=+0.058677173 container remove 7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.199 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9799f7e1-f0ec-4f7d-ae7f-e6dd38fb882a]: (4, ('Wed Jan 21 11:52:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3 (7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3)\n7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3\nWed Jan 21 11:52:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3 (7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3)\n7f4ecc782405d7596d7fb75fed116ba14460590ca686779397997b13cd5211f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.201 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8368e4-d34b-45be-b92b-93142746a7d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.202 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac9d78c1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.204 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 kernel: tapac9d78c1-40: left promiscuous mode
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.210 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[163e9bc9-1e09-48d4-a419-dd1cc171460b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.223 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.237 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c23d4a46-8c80-4b0a-9675-3f5fd5bfd949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.238 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7630c8e9-063a-40fa-bef4-72234f495802]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.260 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2e452c-0ea8-4079-800a-263e21a82d00]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411474, 'reachable_time': 28507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216914, 'error': None, 'target': 'ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.264 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ac9d78c1-4947-4d13-afef-98be9fffb4a3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:52:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:42.264 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[15b42f3b-7e8b-4bb0-bb60-02d17db7f945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:52:42 np0005591285 systemd[1]: run-netns-ovnmeta\x2dac9d78c1\x2d4947\x2d4d13\x2dafef\x2d98be9fffb4a3.mount: Deactivated successfully.
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.346 182759 INFO nova.compute.manager [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.347 182759 DEBUG oslo.service.loopingcall [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.348 182759 DEBUG nova.compute.manager [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:52:42 np0005591285 nova_compute[182755]: 2026-01-21 23:52:42.348 182759 DEBUG nova.network.neutron [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.318 182759 DEBUG nova.compute.manager [req-f4a7909c-afd2-4654-a0f2-1ebd9ae10969 req-c17a7bc2-adce-418f-a6dd-7bc4f38da794 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-vif-unplugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.319 182759 DEBUG oslo_concurrency.lockutils [req-f4a7909c-afd2-4654-a0f2-1ebd9ae10969 req-c17a7bc2-adce-418f-a6dd-7bc4f38da794 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.320 182759 DEBUG oslo_concurrency.lockutils [req-f4a7909c-afd2-4654-a0f2-1ebd9ae10969 req-c17a7bc2-adce-418f-a6dd-7bc4f38da794 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.320 182759 DEBUG oslo_concurrency.lockutils [req-f4a7909c-afd2-4654-a0f2-1ebd9ae10969 req-c17a7bc2-adce-418f-a6dd-7bc4f38da794 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.321 182759 DEBUG nova.compute.manager [req-f4a7909c-afd2-4654-a0f2-1ebd9ae10969 req-c17a7bc2-adce-418f-a6dd-7bc4f38da794 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] No waiting events found dispatching network-vif-unplugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.321 182759 DEBUG nova.compute.manager [req-f4a7909c-afd2-4654-a0f2-1ebd9ae10969 req-c17a7bc2-adce-418f-a6dd-7bc4f38da794 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-vif-unplugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:52:44 np0005591285 nova_compute[182755]: 2026-01-21 23:52:44.598 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.263 182759 DEBUG nova.network.neutron [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.405 182759 INFO nova.compute.manager [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Took 3.06 seconds to deallocate network for instance.#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.689 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.690 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.702 182759 DEBUG nova.compute.manager [req-ad484e92-4566-472d-8f5c-4fe7c6847123 req-6403a9ae-bea9-4b83-a941-1a578f926ce4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-vif-deleted-b6683742-2d0d-48b9-8fcd-c835a90a3423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.875 182759 DEBUG nova.compute.provider_tree [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:52:45 np0005591285 nova_compute[182755]: 2026-01-21 23:52:45.931 182759 DEBUG nova.scheduler.client.report [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.040 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.072 182759 INFO nova.scheduler.client.report [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Deleted allocations for instance f3e5045f-b39a-435f-9112-c2adfb8c8b71#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.316 182759 DEBUG oslo_concurrency.lockutils [None req-275f2413-493a-4c39-8b75-0f45876c39ae 70ab01acb8e9483f8de4b65f638c134d 746e12e350104caf8fd0201b7e30f2fe - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.650 182759 DEBUG nova.compute.manager [req-fb4f62ae-1722-4968-a759-25d3b6949990 req-c92fbc94-6a17-4f65-978c-584121d1a3e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.651 182759 DEBUG oslo_concurrency.lockutils [req-fb4f62ae-1722-4968-a759-25d3b6949990 req-c92fbc94-6a17-4f65-978c-584121d1a3e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.652 182759 DEBUG oslo_concurrency.lockutils [req-fb4f62ae-1722-4968-a759-25d3b6949990 req-c92fbc94-6a17-4f65-978c-584121d1a3e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.652 182759 DEBUG oslo_concurrency.lockutils [req-fb4f62ae-1722-4968-a759-25d3b6949990 req-c92fbc94-6a17-4f65-978c-584121d1a3e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f3e5045f-b39a-435f-9112-c2adfb8c8b71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.653 182759 DEBUG nova.compute.manager [req-fb4f62ae-1722-4968-a759-25d3b6949990 req-c92fbc94-6a17-4f65-978c-584121d1a3e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] No waiting events found dispatching network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:52:46 np0005591285 nova_compute[182755]: 2026-01-21 23:52:46.653 182759 WARNING nova.compute.manager [req-fb4f62ae-1722-4968-a759-25d3b6949990 req-c92fbc94-6a17-4f65-978c-584121d1a3e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Received unexpected event network-vif-plugged-b6683742-2d0d-48b9-8fcd-c835a90a3423 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:52:47 np0005591285 nova_compute[182755]: 2026-01-21 23:52:47.143 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:47 np0005591285 podman[216916]: 2026-01-21 23:52:47.24304046 +0000 UTC m=+0.100926874 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:52:48 np0005591285 podman[216936]: 2026-01-21 23:52:48.313618141 +0000 UTC m=+0.177938427 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41)
Jan 21 18:52:49 np0005591285 nova_compute[182755]: 2026-01-21 23:52:49.600 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:52 np0005591285 nova_compute[182755]: 2026-01-21 23:52:52.147 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:54.120 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:52:54 np0005591285 nova_compute[182755]: 2026-01-21 23:52:54.120 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:54.122 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:52:54 np0005591285 nova_compute[182755]: 2026-01-21 23:52:54.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:56 np0005591285 podman[216957]: 2026-01-21 23:52:56.224927337 +0000 UTC m=+0.085292956 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:52:56 np0005591285 nova_compute[182755]: 2026-01-21 23:52:56.758 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:57 np0005591285 nova_compute[182755]: 2026-01-21 23:52:57.049 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039562.046998, f3e5045f-b39a-435f-9112-c2adfb8c8b71 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:52:57 np0005591285 nova_compute[182755]: 2026-01-21 23:52:57.049 182759 INFO nova.compute.manager [-] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:52:57 np0005591285 nova_compute[182755]: 2026-01-21 23:52:57.086 182759 DEBUG nova.compute.manager [None req-eb65232c-840e-4c51-9396-40e5c4f6e955 - - - - - -] [instance: f3e5045f-b39a-435f-9112-c2adfb8c8b71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:52:57 np0005591285 nova_compute[182755]: 2026-01-21 23:52:57.194 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:52:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:52:59.125 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:52:59 np0005591285 nova_compute[182755]: 2026-01-21 23:52:59.604 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:00 np0005591285 podman[216982]: 2026-01-21 23:53:00.270123966 +0000 UTC m=+0.126920980 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 18:53:02 np0005591285 nova_compute[182755]: 2026-01-21 23:53:02.199 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:02.959 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:02.960 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:02.960 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:03 np0005591285 podman[217008]: 2026-01-21 23:53:03.218256104 +0000 UTC m=+0.083937059 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 18:53:03 np0005591285 podman[217009]: 2026-01-21 23:53:03.239842973 +0000 UTC m=+0.097150164 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 18:53:04 np0005591285 nova_compute[182755]: 2026-01-21 23:53:04.607 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:07 np0005591285 nova_compute[182755]: 2026-01-21 23:53:07.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:09 np0005591285 nova_compute[182755]: 2026-01-21 23:53:09.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:12 np0005591285 nova_compute[182755]: 2026-01-21 23:53:12.244 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:14 np0005591285 nova_compute[182755]: 2026-01-21 23:53:14.671 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.248 182759 DEBUG nova.compute.manager [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.803 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.803 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.844 182759 DEBUG nova.objects.instance [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_requests' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.870 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.870 182759 INFO nova.compute.claims [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.871 182759 DEBUG nova.objects.instance [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.898 182759 DEBUG nova.objects.instance [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.973 182759 INFO nova.compute.resource_tracker [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating resource usage from migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58#033[00m
Jan 21 18:53:15 np0005591285 nova_compute[182755]: 2026-01-21 23:53:15.973 182759 DEBUG nova.compute.resource_tracker [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Starting to track incoming migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58 with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 21 18:53:16 np0005591285 nova_compute[182755]: 2026-01-21 23:53:16.075 182759 DEBUG nova.compute.provider_tree [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:53:16 np0005591285 nova_compute[182755]: 2026-01-21 23:53:16.100 182759 DEBUG nova.scheduler.client.report [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:53:16 np0005591285 nova_compute[182755]: 2026-01-21 23:53:16.123 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:16 np0005591285 nova_compute[182755]: 2026-01-21 23:53:16.124 182759 INFO nova.compute.manager [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Migrating#033[00m
Jan 21 18:53:17 np0005591285 nova_compute[182755]: 2026-01-21 23:53:17.247 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:18 np0005591285 podman[217053]: 2026-01-21 23:53:18.212410905 +0000 UTC m=+0.083266372 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 21 18:53:19 np0005591285 podman[217074]: 2026-01-21 23:53:19.250636378 +0000 UTC m=+0.104126024 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 18:53:19 np0005591285 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 18:53:19 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 18:53:19 np0005591285 systemd-logind[788]: New session 36 of user nova.
Jan 21 18:53:19 np0005591285 nova_compute[182755]: 2026-01-21 23:53:19.676 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:19 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 18:53:19 np0005591285 systemd[1]: Starting User Manager for UID 42436...
Jan 21 18:53:19 np0005591285 systemd[217099]: Queued start job for default target Main User Target.
Jan 21 18:53:19 np0005591285 systemd[217099]: Created slice User Application Slice.
Jan 21 18:53:19 np0005591285 systemd[217099]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:53:19 np0005591285 systemd[217099]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 18:53:19 np0005591285 systemd[217099]: Reached target Paths.
Jan 21 18:53:19 np0005591285 systemd[217099]: Reached target Timers.
Jan 21 18:53:19 np0005591285 systemd[217099]: Starting D-Bus User Message Bus Socket...
Jan 21 18:53:19 np0005591285 systemd[217099]: Starting Create User's Volatile Files and Directories...
Jan 21 18:53:19 np0005591285 systemd[217099]: Listening on D-Bus User Message Bus Socket.
Jan 21 18:53:19 np0005591285 systemd[217099]: Reached target Sockets.
Jan 21 18:53:19 np0005591285 systemd[217099]: Finished Create User's Volatile Files and Directories.
Jan 21 18:53:19 np0005591285 systemd[217099]: Reached target Basic System.
Jan 21 18:53:19 np0005591285 systemd[217099]: Reached target Main User Target.
Jan 21 18:53:19 np0005591285 systemd[217099]: Startup finished in 174ms.
Jan 21 18:53:19 np0005591285 systemd[1]: Started User Manager for UID 42436.
Jan 21 18:53:19 np0005591285 systemd[1]: Started Session 36 of User nova.
Jan 21 18:53:20 np0005591285 systemd[1]: session-36.scope: Deactivated successfully.
Jan 21 18:53:20 np0005591285 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Jan 21 18:53:20 np0005591285 systemd-logind[788]: Removed session 36.
Jan 21 18:53:20 np0005591285 systemd-logind[788]: New session 38 of user nova.
Jan 21 18:53:20 np0005591285 systemd[1]: Started Session 38 of User nova.
Jan 21 18:53:20 np0005591285 systemd[1]: session-38.scope: Deactivated successfully.
Jan 21 18:53:20 np0005591285 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Jan 21 18:53:20 np0005591285 systemd-logind[788]: Removed session 38.
Jan 21 18:53:21 np0005591285 nova_compute[182755]: 2026-01-21 23:53:21.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:22 np0005591285 nova_compute[182755]: 2026-01-21 23:53:22.251 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.389 182759 DEBUG nova.compute.manager [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.390 182759 DEBUG oslo_concurrency.lockutils [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.390 182759 DEBUG oslo_concurrency.lockutils [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.391 182759 DEBUG oslo_concurrency.lockutils [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.391 182759 DEBUG nova.compute.manager [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:53:23 np0005591285 nova_compute[182755]: 2026-01-21 23:53:23.392 182759 WARNING nova.compute.manager [req-8fb1910a-7379-4075-8168-9176af5e7f7c req-a2b84b50-f2ea-4e83-86d9-24d5e5acf814 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 21 18:53:23 np0005591285 systemd-logind[788]: New session 39 of user nova.
Jan 21 18:53:23 np0005591285 systemd[1]: Started Session 39 of User nova.
Jan 21 18:53:24 np0005591285 systemd[1]: session-39.scope: Deactivated successfully.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: Removed session 39.
Jan 21 18:53:24 np0005591285 nova_compute[182755]: 2026-01-21 23:53:24.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:24 np0005591285 systemd-logind[788]: New session 40 of user nova.
Jan 21 18:53:24 np0005591285 systemd[1]: Started Session 40 of User nova.
Jan 21 18:53:24 np0005591285 systemd[1]: session-40.scope: Deactivated successfully.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: Removed session 40.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: New session 41 of user nova.
Jan 21 18:53:24 np0005591285 systemd[1]: Started Session 41 of User nova.
Jan 21 18:53:24 np0005591285 systemd[1]: session-41.scope: Deactivated successfully.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Jan 21 18:53:24 np0005591285 systemd-logind[788]: Removed session 41.
Jan 21 18:53:24 np0005591285 nova_compute[182755]: 2026-01-21 23:53:24.715 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.567 182759 DEBUG nova.compute.manager [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.567 182759 DEBUG oslo_concurrency.lockutils [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.568 182759 DEBUG oslo_concurrency.lockutils [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.568 182759 DEBUG oslo_concurrency.lockutils [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.568 182759 DEBUG nova.compute.manager [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.569 182759 WARNING nova.compute.manager [req-993f94bd-c253-4336-bd1b-e1f14ab1c651 req-a13f6df0-e332-48a4-aeac-170e01fb773d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 21 18:53:25 np0005591285 nova_compute[182755]: 2026-01-21 23:53:25.737 182759 INFO nova.network.neutron [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating port bc3d02f6-e146-4659-b018-41d3813ed1c3 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.261 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.262 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.262 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:53:26 np0005591285 nova_compute[182755]: 2026-01-21 23:53:26.263 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:27 np0005591285 nova_compute[182755]: 2026-01-21 23:53:27.054 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:53:27 np0005591285 nova_compute[182755]: 2026-01-21 23:53:27.254 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:27 np0005591285 podman[217136]: 2026-01-21 23:53:27.257747533 +0000 UTC m=+0.108629604 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:53:27 np0005591285 nova_compute[182755]: 2026-01-21 23:53:27.353 182759 DEBUG nova.compute.manager [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-changed-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:27 np0005591285 nova_compute[182755]: 2026-01-21 23:53:27.354 182759 DEBUG nova.compute.manager [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Refreshing instance network info cache due to event network-changed-bc3d02f6-e146-4659-b018-41d3813ed1c3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:53:27 np0005591285 nova_compute[182755]: 2026-01-21 23:53:27.354 182759 DEBUG oslo_concurrency.lockutils [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.184 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": null, "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.219 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.221 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.221 182759 DEBUG nova.network.neutron [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.223 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.225 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.226 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.227 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.269 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.270 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.270 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.270 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.534 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.538 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5684MB free_disk=73.34846496582031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.538 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.539 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.625 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Applying migration context for instance 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc as it has an incoming, in-progress migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.626 182759 INFO nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating resource usage from migration 7a6f14fa-de30-444c-a296-3a3cc19a7a58#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.687 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.688 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.688 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.753 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.763 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.798 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.845 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:53:29 np0005591285 nova_compute[182755]: 2026-01-21 23:53:29.846 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:30 np0005591285 nova_compute[182755]: 2026-01-21 23:53:30.839 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:31 np0005591285 nova_compute[182755]: 2026-01-21 23:53:31.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:53:31 np0005591285 podman[217163]: 2026-01-21 23:53:31.265635982 +0000 UTC m=+0.126825484 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 21 18:53:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:31Z|00108|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.256 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.695 182759 DEBUG nova.network.neutron [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.717 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.723 182759 DEBUG oslo_concurrency.lockutils [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.723 182759 DEBUG nova.network.neutron [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Refreshing network info cache for port bc3d02f6-e146-4659-b018-41d3813ed1c3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.905 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.908 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.908 182759 INFO nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Creating image(s)#033[00m
Jan 21 18:53:32 np0005591285 nova_compute[182755]: 2026-01-21 23:53:32.910 182759 DEBUG nova.objects.instance [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.025 182759 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.089 182759 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.091 182759 DEBUG nova.virt.disk.api [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.092 182759 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.186 182759 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.188 182759 DEBUG nova.virt.disk.api [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.207 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.207 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Ensure instance console log exists: /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.208 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.208 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.208 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.211 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Start _get_guest_xml network_info=[{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.217 182759 WARNING nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.223 182759 DEBUG nova.virt.libvirt.host [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.224 182759 DEBUG nova.virt.libvirt.host [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.228 182759 DEBUG nova.virt.libvirt.host [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.230 182759 DEBUG nova.virt.libvirt.host [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.232 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.233 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.234 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.234 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.234 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.234 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.235 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.235 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.235 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.235 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.235 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.236 182759 DEBUG nova.virt.hardware [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.236 182759 DEBUG nova.objects.instance [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.280 182759 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.364 182759 DEBUG oslo_concurrency.processutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.365 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.366 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.367 182759 DEBUG oslo_concurrency.lockutils [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.368 182759 DEBUG nova.virt.libvirt.vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:53:25Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.368 182759 DEBUG nova.network.os_vif_util [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.369 182759 DEBUG nova.network.os_vif_util [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.372 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <uuid>40bd1cc4-d1de-4488-8160-e6d4f5fce4bc</uuid>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <name>instance-00000031</name>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <memory>196608</memory>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-732234770</nova:name>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:53:33</nova:creationTime>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.micro">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:memory>192</nova:memory>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        <nova:port uuid="bc3d02f6-e146-4659-b018-41d3813ed1c3">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <entry name="serial">40bd1cc4-d1de-4488-8160-e6d4f5fce4bc</entry>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <entry name="uuid">40bd1cc4-d1de-4488-8160-e6d4f5fce4bc</entry>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/disk.config"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:05:f7:b6"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <target dev="tapbc3d02f6-e1"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc/console.log" append="off"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:53:33 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:53:33 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:53:33 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:53:33 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.373 182759 DEBUG nova.virt.libvirt.vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:53:25Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.373 182759 DEBUG nova.network.os_vif_util [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:05:f7:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.374 182759 DEBUG nova.network.os_vif_util [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.374 182759 DEBUG os_vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.375 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.376 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.376 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.380 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.380 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc3d02f6-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.381 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc3d02f6-e1, col_values=(('external_ids', {'iface-id': 'bc3d02f6-e146-4659-b018-41d3813ed1c3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:f7:b6', 'vm-uuid': '40bd1cc4-d1de-4488-8160-e6d4f5fce4bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.382 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 NetworkManager[55017]: <info>  [1769039613.3840] manager: (tapbc3d02f6-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.388 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.391 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.392 182759 INFO os_vif [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1')#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.477 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.478 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.478 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:05:f7:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.478 182759 INFO nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Using config drive#033[00m
Jan 21 18:53:33 np0005591285 kernel: tapbc3d02f6-e1: entered promiscuous mode
Jan 21 18:53:33 np0005591285 NetworkManager[55017]: <info>  [1769039613.5812] manager: (tapbc3d02f6-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Jan 21 18:53:33 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:33Z|00109|binding|INFO|Claiming lport bc3d02f6-e146-4659-b018-41d3813ed1c3 for this chassis.
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:33Z|00110|binding|INFO|bc3d02f6-e146-4659-b018-41d3813ed1c3: Claiming fa:16:3e:05:f7:b6 10.100.0.3
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.593 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.596 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.608 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:f7:b6 10.100.0.3'], port_security=['fa:16:3e:05:f7:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '40bd1cc4-d1de-4488-8160-e6d4f5fce4bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bc3d02f6-e146-4659-b018-41d3813ed1c3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.610 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bc3d02f6-e146-4659-b018-41d3813ed1c3 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.612 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.631 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc7d277-c973-47e4-a813-03bf183eaa5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.632 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:53:33 np0005591285 podman[217204]: 2026-01-21 23:53:33.635571438 +0000 UTC m=+0.106083167 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.634 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.635 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[738cb2bf-4d6c-4326-9675-367bfb5a1551]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.638 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[62617e69-0532-4c0b-929a-e328b153cbda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 systemd-machined[154022]: New machine qemu-20-instance-00000031.
Jan 21 18:53:33 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:33Z|00111|binding|INFO|Setting lport bc3d02f6-e146-4659-b018-41d3813ed1c3 ovn-installed in OVS
Jan 21 18:53:33 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:33Z|00112|binding|INFO|Setting lport bc3d02f6-e146-4659-b018-41d3813ed1c3 up in Southbound
Jan 21 18:53:33 np0005591285 nova_compute[182755]: 2026-01-21 23:53:33.650 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:33 np0005591285 systemd[1]: Started Virtual Machine qemu-20-instance-00000031.
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.659 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1c85b1-5279-4a67-b65a-e08818d68682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 podman[217201]: 2026-01-21 23:53:33.660798507 +0000 UTC m=+0.130535874 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:53:33 np0005591285 systemd-udevd[217261]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.676 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[20015f1f-b131-4f7f-91bc-a5e54fa0a398]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 NetworkManager[55017]: <info>  [1769039613.6918] device (tapbc3d02f6-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:53:33 np0005591285 NetworkManager[55017]: <info>  [1769039613.6925] device (tapbc3d02f6-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.720 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f6942361-0739-450e-b49f-17fca720b1cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 NetworkManager[55017]: <info>  [1769039613.7321] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Jan 21 18:53:33 np0005591285 systemd-udevd[217264]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.731 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1c2d9e-41ea-461c-97db-f5f5eabe8ef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.778 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4a88df0a-f000-4789-b864-71ff58a9463c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.781 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[81fc54b2-a1ce-4811-af19-17349b635ec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 NetworkManager[55017]: <info>  [1769039613.8163] device (tap7b586c54-30): carrier: link connected
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.828 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3246b8c5-794d-4bc8-b6c6-43647b6804e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.860 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2ff81f-b7ab-4aca-88cd-2a65dd9f162d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417247, 'reachable_time': 18909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217291, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.889 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3b0f25f9-0f1b-4b57-a225-871f605e5f9f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417247, 'tstamp': 417247}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217292, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.929 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[53e6be44-8165-45bb-98cb-7f855c12e1ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417247, 'reachable_time': 18909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217293, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:33.981 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1926a4cc-425d-4a34-89b3-3505bf82ef32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.089 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[331a6d03-06c0-4d74-8a75-d9e22affd0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.092 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.092 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.093 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:34 np0005591285 NetworkManager[55017]: <info>  [1769039614.0982] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 21 18:53:34 np0005591285 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.105 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.106 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:34 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:34Z|00113|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.139 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.141 182759 DEBUG nova.compute.manager [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.141 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[99e2577a-38a7-4e78-ad35-b0a296844cc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.142 182759 DEBUG oslo_concurrency.lockutils [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.142 182759 DEBUG oslo_concurrency.lockutils [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.143 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.143 182759 DEBUG oslo_concurrency.lockutils [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.143 182759 DEBUG nova.compute.manager [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.144 182759 WARNING nova.compute.manager [req-8fce7ef7-eb65-40de-88a0-f9c57aff3e10 req-edcf3a05-78b1-4fd5-a06e-1e8077e26c3c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.144 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:34.148 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.479 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039614.47883, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.481 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.484 182759 DEBUG nova.compute.manager [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.494 182759 INFO nova.virt.libvirt.driver [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance running successfully.#033[00m
Jan 21 18:53:34 np0005591285 virtqemud[182299]: argument unsupported: QEMU guest agent is not configured
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.498 182759 DEBUG nova.virt.libvirt.guest [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.499 182759 DEBUG nova.virt.libvirt.driver [None req-26dc531a-94bd-4864-905e-beac91ad2dea a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.529 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.534 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.562 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.562 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039614.4802804, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.563 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Started (Lifecycle Event)#033[00m
Jan 21 18:53:34 np0005591285 podman[217331]: 2026-01-21 23:53:34.590459028 +0000 UTC m=+0.056182783 container create 8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.621 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.626 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:53:34 np0005591285 systemd[1]: Started libpod-conmon-8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649.scope.
Jan 21 18:53:34 np0005591285 podman[217331]: 2026-01-21 23:53:34.560142702 +0000 UTC m=+0.025866497 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:53:34 np0005591285 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 18:53:34 np0005591285 systemd[217099]: Activating special unit Exit the Session...
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped target Main User Target.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped target Basic System.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped target Paths.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped target Sockets.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped target Timers.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 18:53:34 np0005591285 systemd[217099]: Closed D-Bus User Message Bus Socket.
Jan 21 18:53:34 np0005591285 systemd[217099]: Stopped Create User's Volatile Files and Directories.
Jan 21 18:53:34 np0005591285 systemd[217099]: Removed slice User Application Slice.
Jan 21 18:53:34 np0005591285 systemd[217099]: Reached target Shutdown.
Jan 21 18:53:34 np0005591285 systemd[217099]: Finished Exit the Session.
Jan 21 18:53:34 np0005591285 systemd[217099]: Reached target Exit the Session.
Jan 21 18:53:34 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:53:34 np0005591285 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 18:53:34 np0005591285 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 18:53:34 np0005591285 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.698 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 21 18:53:34 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab6183eb04886ce0e48a2774f17fe1c128d3a3ee7553dd2a300d0a06f6552174/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:53:34 np0005591285 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 18:53:34 np0005591285 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 18:53:34 np0005591285 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 18:53:34 np0005591285 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 18:53:34 np0005591285 podman[217331]: 2026-01-21 23:53:34.726606393 +0000 UTC m=+0.192330158 container init 8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:53:34 np0005591285 podman[217331]: 2026-01-21 23:53:34.732520072 +0000 UTC m=+0.198243837 container start 8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:53:34 np0005591285 nova_compute[182755]: 2026-01-21 23:53:34.755 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:34 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [NOTICE]   (217351) : New worker (217353) forked
Jan 21 18:53:34 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [NOTICE]   (217351) : Loading success.
Jan 21 18:53:35 np0005591285 nova_compute[182755]: 2026-01-21 23:53:35.163 182759 DEBUG nova.network.neutron [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updated VIF entry in instance network info cache for port bc3d02f6-e146-4659-b018-41d3813ed1c3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:53:35 np0005591285 nova_compute[182755]: 2026-01-21 23:53:35.164 182759 DEBUG nova.network.neutron [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [{"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:53:35 np0005591285 nova_compute[182755]: 2026-01-21 23:53:35.187 182759 DEBUG oslo_concurrency.lockutils [req-b5f108c3-5767-4332-8c1d-c545d1eee43f req-4c1ea6a5-d664-44f1-9c3f-960a68ea0ee0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:53:36 np0005591285 nova_compute[182755]: 2026-01-21 23:53:36.352 182759 DEBUG nova.compute.manager [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:36 np0005591285 nova_compute[182755]: 2026-01-21 23:53:36.356 182759 DEBUG oslo_concurrency.lockutils [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:36 np0005591285 nova_compute[182755]: 2026-01-21 23:53:36.356 182759 DEBUG oslo_concurrency.lockutils [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:36 np0005591285 nova_compute[182755]: 2026-01-21 23:53:36.357 182759 DEBUG oslo_concurrency.lockutils [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:36 np0005591285 nova_compute[182755]: 2026-01-21 23:53:36.358 182759 DEBUG nova.compute.manager [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:53:36 np0005591285 nova_compute[182755]: 2026-01-21 23:53:36.359 182759 WARNING nova.compute.manager [req-5c2f91a7-06ed-4026-8c6c-eaeb1700ec25 req-efdad5d1-6cd7-4499-96f3-9b13925f0b0f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state resized and task_state None.#033[00m
Jan 21 18:53:38 np0005591285 nova_compute[182755]: 2026-01-21 23:53:38.384 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:39 np0005591285 nova_compute[182755]: 2026-01-21 23:53:39.759 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:43 np0005591285 nova_compute[182755]: 2026-01-21 23:53:43.388 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:44 np0005591285 nova_compute[182755]: 2026-01-21 23:53:44.803 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:46 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:46Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:f7:b6 10.100.0.3
Jan 21 18:53:48 np0005591285 nova_compute[182755]: 2026-01-21 23:53:48.437 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:49 np0005591285 podman[217380]: 2026-01-21 23:53:49.216237121 +0000 UTC m=+0.081758201 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 21 18:53:49 np0005591285 nova_compute[182755]: 2026-01-21 23:53:49.805 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:50 np0005591285 podman[217400]: 2026-01-21 23:53:50.251291509 +0000 UTC m=+0.109315853 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.729 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.730 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.730 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.731 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.731 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.749 182759 INFO nova.compute.manager [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Terminating instance#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.764 182759 DEBUG nova.compute.manager [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:53:52 np0005591285 kernel: tapbc3d02f6-e1 (unregistering): left promiscuous mode
Jan 21 18:53:52 np0005591285 NetworkManager[55017]: <info>  [1769039632.8011] device (tapbc3d02f6-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:53:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:52Z|00114|binding|INFO|Releasing lport bc3d02f6-e146-4659-b018-41d3813ed1c3 from this chassis (sb_readonly=0)
Jan 21 18:53:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:52Z|00115|binding|INFO|Setting lport bc3d02f6-e146-4659-b018-41d3813ed1c3 down in Southbound
Jan 21 18:53:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:53:52Z|00116|binding|INFO|Removing iface tapbc3d02f6-e1 ovn-installed in OVS
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.816 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:52.824 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:f7:b6 10.100.0.3'], port_security=['fa:16:3e:05:f7:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '40bd1cc4-d1de-4488-8160-e6d4f5fce4bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bc3d02f6-e146-4659-b018-41d3813ed1c3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:53:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:52.827 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bc3d02f6-e146-4659-b018-41d3813ed1c3 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis#033[00m
Jan 21 18:53:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:52.828 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:53:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:52.831 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae88845-490e-42d6-9698-adda49ced9d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:52.832 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore#033[00m
Jan 21 18:53:52 np0005591285 nova_compute[182755]: 2026-01-21 23:53:52.834 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:52 np0005591285 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 21 18:53:52 np0005591285 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000031.scope: Consumed 13.841s CPU time.
Jan 21 18:53:52 np0005591285 systemd-machined[154022]: Machine qemu-20-instance-00000031 terminated.
Jan 21 18:53:53 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [NOTICE]   (217351) : haproxy version is 2.8.14-c23fe91
Jan 21 18:53:53 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [NOTICE]   (217351) : path to executable is /usr/sbin/haproxy
Jan 21 18:53:53 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [WARNING]  (217351) : Exiting Master process...
Jan 21 18:53:53 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [ALERT]    (217351) : Current worker (217353) exited with code 143 (Terminated)
Jan 21 18:53:53 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217346]: [WARNING]  (217351) : All workers exited. Exiting... (0)
Jan 21 18:53:53 np0005591285 systemd[1]: libpod-8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649.scope: Deactivated successfully.
Jan 21 18:53:53 np0005591285 podman[217448]: 2026-01-21 23:53:53.06504639 +0000 UTC m=+0.088773931 container died 8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.069 182759 INFO nova.virt.libvirt.driver [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Instance destroyed successfully.#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.070 182759 DEBUG nova.objects.instance [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.090 182759 DEBUG nova.virt.libvirt.vif [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:52:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-732234770',display_name='tempest-ServerDiskConfigTestJSON-server-732234770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-732234770',id=49,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:53:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-7yxmtfzt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:53:43Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=40bd1cc4-d1de-4488-8160-e6d4f5fce4bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.091 182759 DEBUG nova.network.os_vif_util [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "address": "fa:16:3e:05:f7:b6", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc3d02f6-e1", "ovs_interfaceid": "bc3d02f6-e146-4659-b018-41d3813ed1c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.093 182759 DEBUG nova.network.os_vif_util [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.093 182759 DEBUG os_vif [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.098 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc3d02f6-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.103 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.113 182759 INFO os_vif [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:05:f7:b6,bridge_name='br-int',has_traffic_filtering=True,id=bc3d02f6-e146-4659-b018-41d3813ed1c3,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc3d02f6-e1')#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.114 182759 INFO nova.virt.libvirt.driver [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Deleting instance files /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_del#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.128 182759 INFO nova.virt.libvirt.driver [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Deletion of /var/lib/nova/instances/40bd1cc4-d1de-4488-8160-e6d4f5fce4bc_del complete#033[00m
Jan 21 18:53:53 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649-userdata-shm.mount: Deactivated successfully.
Jan 21 18:53:53 np0005591285 systemd[1]: var-lib-containers-storage-overlay-ab6183eb04886ce0e48a2774f17fe1c128d3a3ee7553dd2a300d0a06f6552174-merged.mount: Deactivated successfully.
Jan 21 18:53:53 np0005591285 podman[217448]: 2026-01-21 23:53:53.226233367 +0000 UTC m=+0.249960918 container cleanup 8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.262 182759 INFO nova.compute.manager [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.264 182759 DEBUG oslo.service.loopingcall [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.265 182759 DEBUG nova.compute.manager [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.265 182759 DEBUG nova.network.neutron [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:53:53 np0005591285 systemd[1]: libpod-conmon-8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649.scope: Deactivated successfully.
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.279 182759 DEBUG nova.compute.manager [req-e8777b7a-4958-4ca4-996d-2953a616f681 req-48d793d2-d6da-445d-bf68-ef0736e53c02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.280 182759 DEBUG oslo_concurrency.lockutils [req-e8777b7a-4958-4ca4-996d-2953a616f681 req-48d793d2-d6da-445d-bf68-ef0736e53c02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.280 182759 DEBUG oslo_concurrency.lockutils [req-e8777b7a-4958-4ca4-996d-2953a616f681 req-48d793d2-d6da-445d-bf68-ef0736e53c02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.280 182759 DEBUG oslo_concurrency.lockutils [req-e8777b7a-4958-4ca4-996d-2953a616f681 req-48d793d2-d6da-445d-bf68-ef0736e53c02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.280 182759 DEBUG nova.compute.manager [req-e8777b7a-4958-4ca4-996d-2953a616f681 req-48d793d2-d6da-445d-bf68-ef0736e53c02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.281 182759 DEBUG nova.compute.manager [req-e8777b7a-4958-4ca4-996d-2953a616f681 req-48d793d2-d6da-445d-bf68-ef0736e53c02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-unplugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:53:53 np0005591285 podman[217492]: 2026-01-21 23:53:53.343152894 +0000 UTC m=+0.070440766 container remove 8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.357 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4af06a8-9010-447d-88ce-02fb6fe49c4a]: (4, ('Wed Jan 21 11:53:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649)\n8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649\nWed Jan 21 11:53:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649)\n8cf1aa422a04b7408f8edf8c758cafccb30fa363470e7444aae87dd77eb83649\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.360 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[55342525-ee85-456d-94f6-e34545adfc6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.362 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:53 np0005591285 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 18:53:53 np0005591285 nova_compute[182755]: 2026-01-21 23:53:53.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.413 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9d632b77-ef45-4347-bb18-6a1bf62f8f59]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.429 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[520fd85b-998b-4742-a537-031ed79916b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.431 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe6180c-d8d9-4b32-acd4-d6f43854bed7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.459 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b32158-0336-48b3-b323-20052ea52991]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417236, 'reachable_time': 43275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217508, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:53 np0005591285 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.466 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:53:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:53.467 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[ba36d531-9567-4b3e-ad89-1cf571d1a0cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:53:54 np0005591285 nova_compute[182755]: 2026-01-21 23:53:54.773 182759 DEBUG nova.network.neutron [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:53:54 np0005591285 nova_compute[182755]: 2026-01-21 23:53:54.800 182759 INFO nova.compute.manager [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Took 1.53 seconds to deallocate network for instance.#033[00m
Jan 21 18:53:54 np0005591285 nova_compute[182755]: 2026-01-21 23:53:54.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:54 np0005591285 nova_compute[182755]: 2026-01-21 23:53:54.910 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:54 np0005591285 nova_compute[182755]: 2026-01-21 23:53:54.911 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.027 182759 DEBUG nova.compute.provider_tree [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.049 182759 DEBUG nova.scheduler.client.report [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.103 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.138 182759 INFO nova.scheduler.client.report [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocations for instance 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.573 182759 DEBUG oslo_concurrency.lockutils [None req-ee29f895-5cff-4f2a-9038-d08eb808154b a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.855 182759 DEBUG nova.compute.manager [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.856 182759 DEBUG oslo_concurrency.lockutils [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.856 182759 DEBUG oslo_concurrency.lockutils [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.857 182759 DEBUG oslo_concurrency.lockutils [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40bd1cc4-d1de-4488-8160-e6d4f5fce4bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.857 182759 DEBUG nova.compute.manager [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] No waiting events found dispatching network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.857 182759 WARNING nova.compute.manager [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received unexpected event network-vif-plugged-bc3d02f6-e146-4659-b018-41d3813ed1c3 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:53:55 np0005591285 nova_compute[182755]: 2026-01-21 23:53:55.858 182759 DEBUG nova.compute.manager [req-84dcb359-ecf0-4648-91ec-893e56267315 req-f559c230-0446-4fae-9673-0e9902bd4d30 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Received event network-vif-deleted-bc3d02f6-e146-4659-b018-41d3813ed1c3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:53:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:57.780 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:53:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:57.781 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:53:57 np0005591285 nova_compute[182755]: 2026-01-21 23:53:57.816 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.084 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.085 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.102 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.119 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:53:58 np0005591285 podman[217509]: 2026-01-21 23:53:58.252606679 +0000 UTC m=+0.116559678 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.260 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.261 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.268 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.268 182759 INFO nova.compute.claims [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.475 182759 DEBUG nova.compute.provider_tree [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.511 182759 DEBUG nova.scheduler.client.report [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.543 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.544 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.647 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.647 182759 DEBUG nova.network.neutron [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.681 182759 INFO nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.727 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.928 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.930 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.930 182759 INFO nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Creating image(s)#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.931 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.931 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.932 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:58 np0005591285 nova_compute[182755]: 2026-01-21 23:53:58.949 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.045 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.046 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.047 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.058 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.133 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.135 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.178 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.180 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.181 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.214 182759 DEBUG nova.policy [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.256 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.257 182759 DEBUG nova.virt.disk.api [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.258 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.326 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.328 182759 DEBUG nova.virt.disk.api [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.329 182759 DEBUG nova.objects.instance [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.362 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.363 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Ensure instance console log exists: /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.364 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.365 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.366 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:53:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:53:59.784 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:53:59 np0005591285 nova_compute[182755]: 2026-01-21 23:53:59.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:00 np0005591285 nova_compute[182755]: 2026-01-21 23:54:00.540 182759 DEBUG nova.network.neutron [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Successfully created port: 2ad8a775-c03c-4a1b-919a-278faef8cb47 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:54:02 np0005591285 podman[217549]: 2026-01-21 23:54:02.318624784 +0000 UTC m=+0.174130217 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.374 182759 DEBUG nova.network.neutron [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Successfully updated port: 2ad8a775-c03c-4a1b-919a-278faef8cb47 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.402 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.402 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.403 182759 DEBUG nova.network.neutron [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.521 182759 DEBUG nova.compute.manager [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-changed-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.522 182759 DEBUG nova.compute.manager [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Refreshing instance network info cache due to event network-changed-2ad8a775-c03c-4a1b-919a-278faef8cb47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.522 182759 DEBUG oslo_concurrency.lockutils [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:02 np0005591285 nova_compute[182755]: 2026-01-21 23:54:02.693 182759 DEBUG nova.network.neutron [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:54:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:02.960 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:02.962 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:02.962 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:03 np0005591285 nova_compute[182755]: 2026-01-21 23:54:03.105 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:04 np0005591285 podman[217576]: 2026-01-21 23:54:04.213599815 +0000 UTC m=+0.081055931 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:54:04 np0005591285 podman[217577]: 2026-01-21 23:54:04.219453524 +0000 UTC m=+0.085421311 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.640 182759 DEBUG nova.network.neutron [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.684 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.685 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance network_info: |[{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.687 182759 DEBUG oslo_concurrency.lockutils [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.687 182759 DEBUG nova.network.neutron [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Refreshing network info cache for port 2ad8a775-c03c-4a1b-919a-278faef8cb47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.695 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Start _get_guest_xml network_info=[{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.704 182759 WARNING nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.710 182759 DEBUG nova.virt.libvirt.host [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.710 182759 DEBUG nova.virt.libvirt.host [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.721 182759 DEBUG nova.virt.libvirt.host [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.722 182759 DEBUG nova.virt.libvirt.host [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.724 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.724 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.724 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.725 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.725 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.725 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.726 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.726 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.726 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.727 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.727 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.727 182759 DEBUG nova.virt.hardware [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.732 182759 DEBUG nova.virt.libvirt.vif [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerD
iskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:53:58Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.733 182759 DEBUG nova.network.os_vif_util [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.734 182759 DEBUG nova.network.os_vif_util [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.735 182759 DEBUG nova.objects.instance [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.771 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <uuid>2c5b484c-19e7-47b1-bf93-fa599ddb6873</uuid>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <name>instance-00000033</name>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1222136722</nova:name>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:54:04</nova:creationTime>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        <nova:port uuid="2ad8a775-c03c-4a1b-919a-278faef8cb47">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <entry name="serial">2c5b484c-19e7-47b1-bf93-fa599ddb6873</entry>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <entry name="uuid">2c5b484c-19e7-47b1-bf93-fa599ddb6873</entry>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:10:e7:d9"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <target dev="tap2ad8a775-c0"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/console.log" append="off"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:54:04 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:54:04 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:54:04 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:54:04 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.773 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Preparing to wait for external event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.774 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.775 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.775 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.777 182759 DEBUG nova.virt.libvirt.vif [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:53:58Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.777 182759 DEBUG nova.network.os_vif_util [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.779 182759 DEBUG nova.network.os_vif_util [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.779 182759 DEBUG os_vif [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.781 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.782 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.782 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.794 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.794 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ad8a775-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.795 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ad8a775-c0, col_values=(('external_ids', {'iface-id': '2ad8a775-c03c-4a1b-919a-278faef8cb47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:e7:d9', 'vm-uuid': '2c5b484c-19e7-47b1-bf93-fa599ddb6873'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:04 np0005591285 NetworkManager[55017]: <info>  [1769039644.7980] manager: (tap2ad8a775-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.797 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.802 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.805 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.806 182759 INFO os_vif [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0')#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.810 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.885 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.886 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.886 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:10:e7:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:54:04 np0005591285 nova_compute[182755]: 2026-01-21 23:54:04.887 182759 INFO nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Using config drive#033[00m
Jan 21 18:54:05 np0005591285 nova_compute[182755]: 2026-01-21 23:54:05.814 182759 INFO nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Creating config drive at /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config#033[00m
Jan 21 18:54:05 np0005591285 nova_compute[182755]: 2026-01-21 23:54:05.826 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfdlsyuyi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:05 np0005591285 nova_compute[182755]: 2026-01-21 23:54:05.963 182759 DEBUG oslo_concurrency.processutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfdlsyuyi" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:06 np0005591285 kernel: tap2ad8a775-c0: entered promiscuous mode
Jan 21 18:54:06 np0005591285 NetworkManager[55017]: <info>  [1769039646.0428] manager: (tap2ad8a775-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.045 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:06Z|00117|binding|INFO|Claiming lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 for this chassis.
Jan 21 18:54:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:06Z|00118|binding|INFO|2ad8a775-c03c-4a1b-919a-278faef8cb47: Claiming fa:16:3e:10:e7:d9 10.100.0.12
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.066 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:e7:d9 10.100.0.12'], port_security=['fa:16:3e:10:e7:d9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=2ad8a775-c03c-4a1b-919a-278faef8cb47) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.068 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 2ad8a775-c03c-4a1b-919a-278faef8cb47 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.069 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff#033[00m
Jan 21 18:54:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:06Z|00119|binding|INFO|Setting lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 ovn-installed in OVS
Jan 21 18:54:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:06Z|00120|binding|INFO|Setting lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 up in Southbound
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.074 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 systemd-udevd[217638]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.077 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.087 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f8737702-ccc9-4a70-9fbc-fa68b56489bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.088 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:54:06 np0005591285 NetworkManager[55017]: <info>  [1769039646.0909] device (tap2ad8a775-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:54:06 np0005591285 NetworkManager[55017]: <info>  [1769039646.0918] device (tap2ad8a775-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.092 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.092 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4db119c0-5d01-4f5b-8983-5a0fd52105cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.093 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a668067f-18d8-4d6e-976f-1c67cd48f7b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 systemd-machined[154022]: New machine qemu-21-instance-00000033.
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.114 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[389c7485-1415-4d63-b048-9ef4ad943988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 systemd[1]: Started Virtual Machine qemu-21-instance-00000033.
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.135 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4b667d-7716-4503-98c1-8ab78e9da924]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.188 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[047450df-8ee2-4581-86ea-b73c58dbf9ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 NetworkManager[55017]: <info>  [1769039646.1974] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.196 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[96211035-4518-4ed3-b0f6-236629b44082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.233 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e56674-3ca2-4831-8d4a-7fec03630338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.237 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[be412815-65e7-4bb1-bf53-55db69c2a9c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 NetworkManager[55017]: <info>  [1769039646.2642] device (tap7b586c54-30): carrier: link connected
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.271 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0f2eae-b78c-45cc-b5b6-d1635d412b1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.300 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea22f19-c5cb-4c9a-b3a1-64504abf3e8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420492, 'reachable_time': 39319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217674, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.325 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8cae5c-f56c-43b2-bcb8-49d27476f7ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420492, 'tstamp': 420492}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217678, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.350 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[48278d14-2d68-44df-b9de-a22833bf6115]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420492, 'reachable_time': 39319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217682, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.395 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[733d9b4c-acd2-4856-a920-048f883de555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.443 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039646.443108, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.444 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Started (Lifecycle Event)#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.501 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d395e4b2-887b-4a37-9927-9bb87c3da978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.503 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.504 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.505 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.508 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 NetworkManager[55017]: <info>  [1769039646.5093] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Jan 21 18:54:06 np0005591285 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.511 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.514 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:06Z|00121|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.516 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.519 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.521 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e73dadcc-e3fe-404c-87d2-e15ad14b76f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.523 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.523 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:54:06 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:06.524 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.530 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039646.443274, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.531 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.561 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.566 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:54:06 np0005591285 nova_compute[182755]: 2026-01-21 23:54:06.607 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:54:06 np0005591285 podman[217715]: 2026-01-21 23:54:06.955410539 +0000 UTC m=+0.061563408 container create d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 18:54:06 np0005591285 systemd[1]: Started libpod-conmon-d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56.scope.
Jan 21 18:54:07 np0005591285 podman[217715]: 2026-01-21 23:54:06.924430115 +0000 UTC m=+0.030582984 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:54:07 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:54:07 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc95c0cad680c89c25343b8bb5f7b754cd4e2f3a06e0ea54f59f92b890fcd035/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:54:07 np0005591285 podman[217715]: 2026-01-21 23:54:07.058778501 +0000 UTC m=+0.164931450 container init d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:54:07 np0005591285 podman[217715]: 2026-01-21 23:54:07.065409209 +0000 UTC m=+0.171562098 container start d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:54:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [NOTICE]   (217735) : New worker (217737) forked
Jan 21 18:54:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [NOTICE]   (217735) : Loading success.
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.630 182759 DEBUG nova.network.neutron [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updated VIF entry in instance network info cache for port 2ad8a775-c03c-4a1b-919a-278faef8cb47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.632 182759 DEBUG nova.network.neutron [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.682 182759 DEBUG nova.compute.manager [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.683 182759 DEBUG oslo_concurrency.lockutils [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.683 182759 DEBUG oslo_concurrency.lockutils [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.683 182759 DEBUG oslo_concurrency.lockutils [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.684 182759 DEBUG nova.compute.manager [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Processing event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.684 182759 DEBUG nova.compute.manager [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.685 182759 DEBUG oslo_concurrency.lockutils [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.685 182759 DEBUG oslo_concurrency.lockutils [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.685 182759 DEBUG oslo_concurrency.lockutils [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.686 182759 DEBUG nova.compute.manager [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.686 182759 WARNING nova.compute.manager [req-26c0b922-36bf-42f4-8744-bba9cee1fd0d req-502dc56b-a5d0-4253-8fda-5cb1ca1ec0a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.687 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.691 182759 DEBUG oslo_concurrency.lockutils [req-1ebe081c-32f5-4b80-b469-17e2b1213781 req-dc65971f-30eb-4a64-bb8f-d86edf83f705 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.695 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039647.6947894, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.695 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.697 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.702 182759 INFO nova.virt.libvirt.driver [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance spawned successfully.#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.702 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.724 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.729 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.731 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.732 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.732 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.733 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.733 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.733 182759 DEBUG nova.virt.libvirt.driver [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.765 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.908 182759 INFO nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Took 8.98 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:54:07 np0005591285 nova_compute[182755]: 2026-01-21 23:54:07.909 182759 DEBUG nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:08 np0005591285 nova_compute[182755]: 2026-01-21 23:54:08.069 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039633.0623932, 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:08 np0005591285 nova_compute[182755]: 2026-01-21 23:54:08.070 182759 INFO nova.compute.manager [-] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:54:08 np0005591285 nova_compute[182755]: 2026-01-21 23:54:08.094 182759 INFO nova.compute.manager [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Took 9.87 seconds to build instance.#033[00m
Jan 21 18:54:08 np0005591285 nova_compute[182755]: 2026-01-21 23:54:08.098 182759 DEBUG nova.compute.manager [None req-11b070ab-091b-4149-bc73-755cc9328a01 - - - - - -] [instance: 40bd1cc4-d1de-4488-8160-e6d4f5fce4bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:08 np0005591285 nova_compute[182755]: 2026-01-21 23:54:08.124 182759 DEBUG oslo_concurrency.lockutils [None req-7900a28e-a76b-4bba-a7e4-97a2b29e0f2c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:09 np0005591285 nova_compute[182755]: 2026-01-21 23:54:09.798 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:09 np0005591285 nova_compute[182755]: 2026-01-21 23:54:09.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:14 np0005591285 nova_compute[182755]: 2026-01-21 23:54:14.804 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:14 np0005591285 nova_compute[182755]: 2026-01-21 23:54:14.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:19 np0005591285 nova_compute[182755]: 2026-01-21 23:54:19.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:19 np0005591285 nova_compute[182755]: 2026-01-21 23:54:19.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:20 np0005591285 podman[217763]: 2026-01-21 23:54:20.255531594 +0000 UTC m=+0.109400926 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 18:54:20 np0005591285 podman[217785]: 2026-01-21 23:54:20.400414993 +0000 UTC m=+0.086991972 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Jan 21 18:54:21 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:21Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:10:e7:d9 10.100.0.12
Jan 21 18:54:21 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:21Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:10:e7:d9 10.100.0.12
Jan 21 18:54:22 np0005591285 nova_compute[182755]: 2026-01-21 23:54:22.149 182759 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:22 np0005591285 nova_compute[182755]: 2026-01-21 23:54:22.150 182759 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:22 np0005591285 nova_compute[182755]: 2026-01-21 23:54:22.151 182759 DEBUG nova.network.neutron [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:54:22 np0005591285 nova_compute[182755]: 2026-01-21 23:54:22.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.158 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000033', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'hostId': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.206 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.207 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4404544-7156-4c5e-942f-ca0295bc3ea4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.161046', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82b1fb4e-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': '704040feb17666492f612b2468edf69706946bf78cc9848fb0fe3efc76fecc4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.161046', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82b22826-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': '184a7381de64bf35f27e62051974fea9de83d3115e50030592c4b227e84bdd93'}]}, 'timestamp': '2026-01-21 23:54:23.208688', '_unique_id': '38d8bded926b4ceb8f802531ef5239c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.213 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 18:54:23 np0005591285 nova_compute[182755]: 2026-01-21 23:54:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.220 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2c5b484c-19e7-47b1-bf93-fa599ddb6873 / tap2ad8a775-c0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.221 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb2de40c-56a5-40d5-99f1-c774622ed6ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.215997', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82b43788-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': '2a0ba0e3488abc6ac55cc9bb567780985a80d09119f3b95e2ea7b0c56b891c23'}]}, 'timestamp': '2026-01-21 23:54:23.222040', '_unique_id': 'b82dd79fde54417a91e2fd701a2628f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.223 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.224 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.224 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>]
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.241 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.241 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ae90b8a-07f0-4bae-872c-f89782d2731d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.226005', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82b73640-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.945225892, 'message_signature': '926fca3bdd7b83be8a12656f82e8afe594763221a38335f1086b11ee615b53f8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.226005', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82b7477a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.945225892, 'message_signature': '49799a5cdc139b27e788768450d29409c997cf6160a5f15e4db37397d9dec656'}]}, 'timestamp': '2026-01-21 23:54:23.242106', '_unique_id': '70d3e31a6a3a4959a1463a2dc794171c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.244 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.273 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/cpu volume: 11760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '134b5768-094e-40b9-86fe-f137f1570231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11760000000, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'timestamp': '2026-01-21T23:54:23.244631', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '82bc25c4-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.992277259, 'message_signature': '16efa61274e792ef108145cd755c5c29bf4feba0f379dd76dac4c64aecf03742'}]}, 'timestamp': '2026-01-21 23:54:23.274063', '_unique_id': '7bb09dfa74244a909a4a11b1c116d8c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.276 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.277 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.write.latency volume: 1818411318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.277 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f30743cf-5f5b-4e56-bf4e-7bb6b4a33b52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1818411318, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.277074', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82bcb246-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': '48c7951dbd8216eda266b7898ea22fee5c6c23c2fdbd8eefe67786c656ce26a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.277074', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82bcc312-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': 'b1d3815cf225524277db10710162867ad3336c03e954d4a449fe11426e86c92c'}]}, 'timestamp': '2026-01-21 23:54:23.277986', '_unique_id': '29e615303a074081a6fd9082190af101'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.280 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d798bc0-f95c-4b5e-aaf2-f68fae61cdac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.280301', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82bd314e-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': '84a31661d000fd40f2f1827cd4eedcc7eacc6404c1e7b7fd5e824423bac5de43'}]}, 'timestamp': '2026-01-21 23:54:23.280799', '_unique_id': 'bc1f0356d9c94812aefe7f812b96d31a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.283 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '187a31c9-9cda-43cd-9f9f-82b8992dcb0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.283137', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82bd9f08-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': '8b0506fc30148f16fd19e7826fb37d3089bfeb817e5127bd0e94ec3780e60b08'}]}, 'timestamp': '2026-01-21 23:54:23.283610', '_unique_id': 'b08d680f69ab4212939b157f565fa870'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.284 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.285 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.286 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74293b16-c840-4b86-89b0-791fd6d303a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.286030', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82be10c8-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': '25dae6cf9fb18fe26901e0d25631356dd8f36b79bb6243fae50f31ae1cb3047c'}]}, 'timestamp': '2026-01-21 23:54:23.286524', '_unique_id': 'e4a42dbf51e24e46aed340ed5f0fe081'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.289 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.289 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87eedbcc-c453-4aa5-9423-d1c7372adc4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'timestamp': '2026-01-21T23:54:23.289471', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '82be96ec-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.992277259, 'message_signature': '2417199d8ff8061acef6a0cf4d1a04be7bb9628328eb297deaed50c1add5915a'}]}, 'timestamp': '2026-01-21 23:54:23.289976', '_unique_id': '8750e46aa1d34cc08c151c25c0e045c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.292 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.292 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e53022e-1cf3-4a31-8069-dfa347ac2371', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.292235', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82bf02f8-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': '1075d5db168050bc98cc844723b9954a52fec14d564efe5aa30c6cf787353d33'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.292235', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82bf17f2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': 'f463679bfa3f9dc9f7fe7ee1536e96fe8a0d2842c9d7835339adfd39f0b7a21f'}]}, 'timestamp': '2026-01-21 23:54:23.293238', '_unique_id': '36c0a15f8a0e499d80107c63529c6148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.295 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.read.latency volume: 210019418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.296 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.read.latency volume: 29295891 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe44988f-b8aa-477b-b1a8-26bb74814d63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 210019418, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.295585', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82bf8cbe-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': 'b7fed36b0216971f201dbdcafc68faa9dbe3eb7aec956852a202309e4a7fe510'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29295891, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.295585', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82bf9dbc-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': '68617849470e697958b0d9a23f32d80475a241685a2e41a6e7614803f51c6714'}]}, 'timestamp': '2026-01-21 23:54:23.296657', '_unique_id': 'b6ee0a4ea68642319b7988a7fa5dd6f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.299 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bd29571-cd38-49aa-bfa6-d64c25e80830', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.298961', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82c00946-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': 'ea3b60840518134622464305eca8c899a22d98e275375b89313b4274de8735ba'}]}, 'timestamp': '2026-01-21 23:54:23.299440', '_unique_id': 'ad4b196ee6bd4119915055daa826b643'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.301 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.read.requests volume: 1066 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82a7903c-8472-4dca-8113-d40975ed97fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1066, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.301668', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82c07476-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': 'cd2bf3340052448fc822bbe472404b99e295355441905243a89ab0d5cc000cd9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.301668', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82c0851a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': 'd797954cecb789d1308e98817377d42e792ac9e845eaf90ed03b45d70ccc8099'}]}, 'timestamp': '2026-01-21 23:54:23.302577', '_unique_id': '6d4cfb80715441dabe654348cb206b9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.305 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.305 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>]
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.305 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.305 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.305 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>]
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.306 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '840d066d-b314-4dca-88e2-a1fa479e7511', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.306328', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82c1292a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': '830b27466df25c57672034e5768076c32afbaf5d60ac9d8e4c0113aa43e0cc1d'}]}, 'timestamp': '2026-01-21 23:54:23.306845', '_unique_id': 'e11c64c02f4f49b3a841092ea5ed4513'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.317 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e316be4-8ad5-463f-90dc-017428c3dc38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.317232', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82c2d144-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': '3a8c2923a2e5c8d9e26a4d4c0c3baf615e9332174ad4809e3f61ffdb85d9bb55'}]}, 'timestamp': '2026-01-21 23:54:23.317630', '_unique_id': '083fb7cd74fd4008bfe8bd14e421c7fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.318 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.319 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.319 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e2329f0-dd03-4e36-96d4-0ec9f1639669', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.319437', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82c325d6-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.945225892, 'message_signature': '5f04b515a6ba7ecc1a51718f6dc1b6e1d1d8677e75aff9eff855ba872fc5ba3a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.319437', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82c33350-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.945225892, 'message_signature': '7d4d2585cc360205599200b775b171222c41cc4350e901dbccf702453bf512e1'}]}, 'timestamp': '2026-01-21 23:54:23.320089', '_unique_id': '49bc4b80fe154f99b2c17989227587a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.321 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.321 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.322 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1222136722>]
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.322 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.read.bytes volume: 29465088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.322 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7a18d2a-78db-477e-ac27-81dd4475aaf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29465088, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.322476', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82c39ff2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': '411e05be5e63ceed05e24bb9eda6cbf081dc88a12e005a13105c0b87db9671b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 
'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.322476', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82c3b104-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.880377997, 'message_signature': 'daa4f59398df3647eb01be234662563ec37599edf14bccff143830e129176027'}]}, 'timestamp': '2026-01-21 23:54:23.323350', '_unique_id': 'a453393594c041b894900da8bb678da2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.324 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.325 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20b117b7-3228-43a9-bee2-b5705093e749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-vda', 'timestamp': '2026-01-21T23:54:23.324939', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '82c3fcf4-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.945225892, 'message_signature': '403943b7c93dbc49448827facc0a29955c1d8bc09641cf52433ef3c8e9679a85'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 
'resource_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873-sda', 'timestamp': '2026-01-21T23:54:23.324939', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'instance-00000033', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '82c407da-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.945225892, 'message_signature': 'a8f2651a7279ce3fdfb3f197b8b7d926a812b80082e01cd9f082ecee641e9b63'}]}, 'timestamp': '2026-01-21 23:54:23.325520', '_unique_id': '9ac9850952d64876981980ee6c2eb9e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.327 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db9b602e-d4eb-4c18-997e-f8c5ddb37b39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.327069', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82c45014-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': 'f919d3c8b4265df04fc4569b62fb0348556c0d81e8327e43952f1aff5996a858'}]}, 'timestamp': '2026-01-21 23:54:23.327387', '_unique_id': 'feb5d30ebfcb4df8a1a2364922aa2196'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.328 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b04c1b34-f7cc-4c7d-9c52-476866863470', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.328900', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82c497ea-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': 'f79c9b34adf389ec4f872982faebf3863a381f31ef68df982681927444218cfa'}]}, 'timestamp': '2026-01-21 23:54:23.329229', '_unique_id': '43dd9edf138e4e3bbe8c8243f7d0a617'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.329 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.330 12 DEBUG ceilometer.compute.pollsters [-] 2c5b484c-19e7-47b1-bf93-fa599ddb6873/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83f12fcf-6f93-4ebc-9a77-ca682f0891fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-00000033-2c5b484c-19e7-47b1-bf93-fa599ddb6873-tap2ad8a775-c0', 'timestamp': '2026-01-21T23:54:23.330731', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1222136722', 'name': 'tap2ad8a775-c0', 'instance_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'instance_type': 'm1.nano', 'host': '5e88b73f28190189c9d9e7f1c76645cadee6e8a282dc99bb0afd881f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:10:e7:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2ad8a775-c0'}, 'message_id': '82c4dfe8-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4221.935264985, 'message_signature': 'eebb937427ff817d6e649e6a79ceb095f5487c20f143bb45fe576cf1466c7843'}]}, 'timestamp': '2026-01-21 23:54:23.331073', '_unique_id': 'a38e9ecc4b6e4eeca3c93305146fe0d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:54:23.331 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:54:24 np0005591285 nova_compute[182755]: 2026-01-21 23:54:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:24 np0005591285 nova_compute[182755]: 2026-01-21 23:54:24.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:24 np0005591285 nova_compute[182755]: 2026-01-21 23:54:24.820 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:25 np0005591285 nova_compute[182755]: 2026-01-21 23:54:25.797 182759 DEBUG nova.network.neutron [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:54:25 np0005591285 nova_compute[182755]: 2026-01-21 23:54:25.826 182759 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:25 np0005591285 nova_compute[182755]: 2026-01-21 23:54:25.989 182759 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 21 18:54:25 np0005591285 nova_compute[182755]: 2026-01-21 23:54:25.990 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Creating file /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/f2214aef74984f878d3d97d0219b2395.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 21 18:54:25 np0005591285 nova_compute[182755]: 2026-01-21 23:54:25.991 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/f2214aef74984f878d3d97d0219b2395.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.437 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/f2214aef74984f878d3d97d0219b2395.tmp" returned: 1 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.438 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/f2214aef74984f878d3d97d0219b2395.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.439 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Creating directory /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.439 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.686 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873" returned: 0 in 0.247s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:26 np0005591285 nova_compute[182755]: 2026-01-21 23:54:26.693 182759 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.248 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.365 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.449 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.451 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.546 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.782 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.785 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5510MB free_disk=73.3476676940918GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.785 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.786 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.852 182759 INFO nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating resource usage from migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.895 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.896 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.896 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.915 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.935 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.935 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.951 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 18:54:27 np0005591285 nova_compute[182755]: 2026-01-21 23:54:27.978 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 18:54:28 np0005591285 nova_compute[182755]: 2026-01-21 23:54:28.029 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:54:28 np0005591285 nova_compute[182755]: 2026-01-21 23:54:28.071 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:54:28 np0005591285 nova_compute[182755]: 2026-01-21 23:54:28.120 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:54:28 np0005591285 nova_compute[182755]: 2026-01-21 23:54:28.120 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:28 np0005591285 kernel: tap2ad8a775-c0 (unregistering): left promiscuous mode
Jan 21 18:54:28 np0005591285 NetworkManager[55017]: <info>  [1769039668.9383] device (tap2ad8a775-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:54:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:28Z|00122|binding|INFO|Releasing lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 from this chassis (sb_readonly=0)
Jan 21 18:54:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:28Z|00123|binding|INFO|Setting lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 down in Southbound
Jan 21 18:54:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:28Z|00124|binding|INFO|Removing iface tap2ad8a775-c0 ovn-installed in OVS
Jan 21 18:54:28 np0005591285 nova_compute[182755]: 2026-01-21 23:54:28.948 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:28.958 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:e7:d9 10.100.0.12'], port_security=['fa:16:3e:10:e7:d9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=2ad8a775-c03c-4a1b-919a-278faef8cb47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:54:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:28.961 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 2ad8a775-c03c-4a1b-919a-278faef8cb47 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis#033[00m
Jan 21 18:54:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:28.964 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:54:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:28.966 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bc837565-e7eb-4211-b5ac-f045851ff357]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:28.967 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore#033[00m
Jan 21 18:54:28 np0005591285 nova_compute[182755]: 2026-01-21 23:54:28.982 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 21 18:54:29 np0005591285 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000033.scope: Consumed 13.852s CPU time.
Jan 21 18:54:29 np0005591285 systemd-machined[154022]: Machine qemu-21-instance-00000033 terminated.
Jan 21 18:54:29 np0005591285 podman[217819]: 2026-01-21 23:54:29.081526998 +0000 UTC m=+0.114263436 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.121 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.122 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.122 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:54:29 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [NOTICE]   (217735) : haproxy version is 2.8.14-c23fe91
Jan 21 18:54:29 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [NOTICE]   (217735) : path to executable is /usr/sbin/haproxy
Jan 21 18:54:29 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [WARNING]  (217735) : Exiting Master process...
Jan 21 18:54:29 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [WARNING]  (217735) : Exiting Master process...
Jan 21 18:54:29 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [ALERT]    (217735) : Current worker (217737) exited with code 143 (Terminated)
Jan 21 18:54:29 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217731]: [WARNING]  (217735) : All workers exited. Exiting... (0)
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.165 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.166 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:29 np0005591285 systemd[1]: libpod-d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56.scope: Deactivated successfully.
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.167 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.167 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:54:29 np0005591285 podman[217867]: 2026-01-21 23:54:29.171208032 +0000 UTC m=+0.066121831 container died d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:54:29 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56-userdata-shm.mount: Deactivated successfully.
Jan 21 18:54:29 np0005591285 systemd[1]: var-lib-containers-storage-overlay-dc95c0cad680c89c25343b8bb5f7b754cd4e2f3a06e0ea54f59f92b890fcd035-merged.mount: Deactivated successfully.
Jan 21 18:54:29 np0005591285 podman[217867]: 2026-01-21 23:54:29.235199024 +0000 UTC m=+0.130112783 container cleanup d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:54:29 np0005591285 systemd[1]: libpod-conmon-d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56.scope: Deactivated successfully.
Jan 21 18:54:29 np0005591285 podman[217911]: 2026-01-21 23:54:29.310579413 +0000 UTC m=+0.046289867 container remove d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.317 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[289fef45-d158-461f-b29e-1736dd15d315]: (4, ('Wed Jan 21 11:54:29 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56)\nd0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56\nWed Jan 21 11:54:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (d0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56)\nd0a3fbfe21a547c02a504e096feab1e7c307e36683ecc89137ba6cc96a5e5e56\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.320 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3a60bd54-59ac-4dfa-ad5a-24d8227d686e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.321 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:29 np0005591285 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.324 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.344 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.348 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc79653-d63f-4704-8dd0-467689ef7eb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.363 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea74432-483c-4e58-a291-22563496c151]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.365 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e656e743-f1b5-4c60-98e4-10e1af8ecce4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.391 182759 DEBUG nova.compute.manager [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.392 182759 DEBUG oslo_concurrency.lockutils [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.392 182759 DEBUG oslo_concurrency.lockutils [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.392 182759 DEBUG oslo_concurrency.lockutils [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.392 182759 DEBUG nova.compute.manager [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.393 182759 WARNING nova.compute.manager [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.392 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a53012e-083e-4c33-8220-fa31f1c3ad14]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420483, 'reachable_time': 31784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217933, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.398 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:54:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:29.399 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[bcee69ac-16ff-4080-9bfd-675ed9e3ba35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:29 np0005591285 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.719 182759 INFO nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance shutdown successfully after 3 seconds.#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.729 182759 INFO nova.virt.libvirt.driver [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance destroyed successfully.#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.730 182759 DEBUG nova.virt.libvirt.vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',imag
e_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:54:21Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.731 182759 DEBUG nova.network.os_vif_util [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.733 182759 DEBUG nova.network.os_vif_util [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.734 182759 DEBUG os_vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.738 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ad8a775-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.740 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.743 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.747 182759 INFO os_vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0')#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.752 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.822 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.847 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.849 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.920 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.922 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Copying file /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk to 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 18:54:29 np0005591285 nova_compute[182755]: 2026-01-21 23:54:29.923 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.472 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "scp -r /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.474 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Copying file /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.474 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.727 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "scp -C -r /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config" returned: 0 in 0.253s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.729 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Copying file /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.729 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:30 np0005591285 nova_compute[182755]: 2026-01-21 23:54:30.950 182759 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "scp -C -r /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.342 182759 DEBUG neutronclient.v2_0.client [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 2ad8a775-c03c-4a1b-919a-278faef8cb47 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.528 182759 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.529 182759 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.530 182759 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.573 182759 DEBUG nova.compute.manager [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.574 182759 DEBUG oslo_concurrency.lockutils [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.575 182759 DEBUG oslo_concurrency.lockutils [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.575 182759 DEBUG oslo_concurrency.lockutils [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.575 182759 DEBUG nova.compute.manager [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:54:31 np0005591285 nova_compute[182755]: 2026-01-21 23:54:31.576 182759 WARNING nova.compute.manager [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.177 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.202 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.203 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.204 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.204 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.204 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:54:33 np0005591285 podman[217946]: 2026-01-21 23:54:33.287553201 +0000 UTC m=+0.144288955 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.960 182759 DEBUG nova.compute.manager [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-changed-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.961 182759 DEBUG nova.compute.manager [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Refreshing instance network info cache due to event network-changed-2ad8a775-c03c-4a1b-919a-278faef8cb47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.961 182759 DEBUG oslo_concurrency.lockutils [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.962 182759 DEBUG oslo_concurrency.lockutils [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:33 np0005591285 nova_compute[182755]: 2026-01-21 23:54:33.962 182759 DEBUG nova.network.neutron [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Refreshing network info cache for port 2ad8a775-c03c-4a1b-919a-278faef8cb47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:54:34 np0005591285 nova_compute[182755]: 2026-01-21 23:54:34.742 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:34 np0005591285 nova_compute[182755]: 2026-01-21 23:54:34.824 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:35 np0005591285 podman[217972]: 2026-01-21 23:54:35.213076675 +0000 UTC m=+0.074825634 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:54:35 np0005591285 podman[217973]: 2026-01-21 23:54:35.2604522 +0000 UTC m=+0.119671962 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 18:54:39 np0005591285 nova_compute[182755]: 2026-01-21 23:54:39.552 182759 DEBUG nova.network.neutron [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updated VIF entry in instance network info cache for port 2ad8a775-c03c-4a1b-919a-278faef8cb47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:54:39 np0005591285 nova_compute[182755]: 2026-01-21 23:54:39.553 182759 DEBUG nova.network.neutron [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:54:39 np0005591285 nova_compute[182755]: 2026-01-21 23:54:39.590 182759 DEBUG oslo_concurrency.lockutils [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:39 np0005591285 nova_compute[182755]: 2026-01-21 23:54:39.745 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:39 np0005591285 nova_compute[182755]: 2026-01-21 23:54:39.826 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:40 np0005591285 nova_compute[182755]: 2026-01-21 23:54:40.869 182759 DEBUG nova.compute.manager [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:40 np0005591285 nova_compute[182755]: 2026-01-21 23:54:40.870 182759 DEBUG oslo_concurrency.lockutils [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:40 np0005591285 nova_compute[182755]: 2026-01-21 23:54:40.871 182759 DEBUG oslo_concurrency.lockutils [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:40 np0005591285 nova_compute[182755]: 2026-01-21 23:54:40.871 182759 DEBUG oslo_concurrency.lockutils [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:40 np0005591285 nova_compute[182755]: 2026-01-21 23:54:40.872 182759 DEBUG nova.compute.manager [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:54:40 np0005591285 nova_compute[182755]: 2026-01-21 23:54:40.872 182759 WARNING nova.compute.manager [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 21 18:54:42 np0005591285 nova_compute[182755]: 2026-01-21 23:54:42.974 182759 DEBUG nova.compute.manager [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:42 np0005591285 nova_compute[182755]: 2026-01-21 23:54:42.975 182759 DEBUG oslo_concurrency.lockutils [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:42 np0005591285 nova_compute[182755]: 2026-01-21 23:54:42.975 182759 DEBUG oslo_concurrency.lockutils [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:42 np0005591285 nova_compute[182755]: 2026-01-21 23:54:42.976 182759 DEBUG oslo_concurrency.lockutils [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:42 np0005591285 nova_compute[182755]: 2026-01-21 23:54:42.976 182759 DEBUG nova.compute.manager [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:54:42 np0005591285 nova_compute[182755]: 2026-01-21 23:54:42.977 182759 WARNING nova.compute.manager [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state resized and task_state None.#033[00m
Jan 21 18:54:43 np0005591285 nova_compute[182755]: 2026-01-21 23:54:43.487 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:43 np0005591285 nova_compute[182755]: 2026-01-21 23:54:43.487 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:43 np0005591285 nova_compute[182755]: 2026-01-21 23:54:43.488 182759 DEBUG nova.compute.manager [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 21 18:54:43 np0005591285 nova_compute[182755]: 2026-01-21 23:54:43.526 182759 DEBUG nova.objects.instance [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'info_cache' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.232 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039669.2314832, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.233 182759 INFO nova.compute.manager [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.523 182759 DEBUG nova.compute.manager [None req-98b3d5b9-59c1-48db-b75c-fc6e7096b6c6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.529 182759 DEBUG nova.compute.manager [None req-98b3d5b9-59c1-48db-b75c-fc6e7096b6c6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.550 182759 INFO nova.compute.manager [None req-98b3d5b9-59c1-48db-b75c-fc6e7096b6c6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:44 np0005591285 nova_compute[182755]: 2026-01-21 23:54:44.828 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:45 np0005591285 nova_compute[182755]: 2026-01-21 23:54:45.194 182759 DEBUG neutronclient.v2_0.client [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 2ad8a775-c03c-4a1b-919a-278faef8cb47 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 21 18:54:45 np0005591285 nova_compute[182755]: 2026-01-21 23:54:45.195 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:45 np0005591285 nova_compute[182755]: 2026-01-21 23:54:45.195 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:45 np0005591285 nova_compute[182755]: 2026-01-21 23:54:45.195 182759 DEBUG nova.network.neutron [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.254 182759 DEBUG nova.network.neutron [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.302 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.303 182759 DEBUG nova.objects.instance [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.344 182759 DEBUG nova.virt.libvirt.vif [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:54:40Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.345 182759 DEBUG nova.network.os_vif_util [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.347 182759 DEBUG nova.network.os_vif_util [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.347 182759 DEBUG os_vif [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.350 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.351 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ad8a775-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.352 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.356 182759 INFO os_vif [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0')#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.357 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.357 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.448 182759 DEBUG nova.compute.provider_tree [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.471 182759 DEBUG nova.scheduler.client.report [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.529 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:47 np0005591285 nova_compute[182755]: 2026-01-21 23:54:47.717 182759 INFO nova.scheduler.client.report [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocation for migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f#033[00m
Jan 21 18:54:48 np0005591285 nova_compute[182755]: 2026-01-21 23:54:48.080 182759 DEBUG oslo_concurrency.lockutils [None req-f6ce3027-5da7-49dd-a4f7-2b7f63b02400 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:49 np0005591285 nova_compute[182755]: 2026-01-21 23:54:49.752 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:49 np0005591285 nova_compute[182755]: 2026-01-21 23:54:49.831 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:51 np0005591285 podman[218014]: 2026-01-21 23:54:51.244534472 +0000 UTC m=+0.105814139 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Jan 21 18:54:51 np0005591285 podman[218015]: 2026-01-21 23:54:51.256015751 +0000 UTC m=+0.113259019 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.323 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.324 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.352 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.475 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.477 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.488 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.489 182759 INFO nova.compute.claims [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.654 182759 DEBUG nova.compute.provider_tree [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.672 182759 DEBUG nova.scheduler.client.report [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.697 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.698 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.795 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.796 182759 DEBUG nova.network.neutron [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.828 182759 INFO nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:54:53 np0005591285 nova_compute[182755]: 2026-01-21 23:54:53.864 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.018 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.020 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.021 182759 INFO nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Creating image(s)#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.021 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.022 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.022 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:54 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]

Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.041 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.110 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.111 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.112 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.128 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.200 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.202 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.240 182759 DEBUG nova.policy [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.262 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
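Every `qemu-img info` call in this sequence follows the same pattern: wrapped in oslo.concurrency's `prlimit` helper, which caps the child's address space and CPU time so a malformed image cannot hang or balloon the compute service, and run under a C locale for stable, parseable output. A sketch of how that command line is assembled (the `qemu_img_info_cmd` helper is hypothetical; the path and limits are taken from the log):

```python
# Rebuild the qemu-img info command exactly as it appears in the log:
# the prlimit wrapper caps address space (1 GiB) and CPU time (30 s),
# and LC_ALL=C/LANG=C keep qemu-img's JSON output locale-independent.
BASE = "/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474"

def qemu_img_info_cmd(path, as_limit=1073741824, cpu_limit=30):
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={as_limit}", f"--cpu={cpu_limit}", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]

print(" ".join(qemu_img_info_cmd(BASE)))
```

The `qemu-img create` record just above uses the same base image as `backing_file=...,backing_fmt=raw`, so the per-instance `disk` is a thin qcow2 overlay over the shared `_base` image rather than a full copy.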
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.264 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.265 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.357 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.359 182759 DEBUG nova.virt.disk.api [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.360 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.439 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.441 182759 DEBUG nova.virt.disk.api [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.442 182759 DEBUG nova.objects.instance [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid f9f52173-c7d5-4d5a-9edc-b3c4e283213d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.468 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.468 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Ensure instance console log exists: /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.470 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.470 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.471 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.754 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:54 np0005591285 nova_compute[182755]: 2026-01-21 23:54:54.834 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:55 np0005591285 nova_compute[182755]: 2026-01-21 23:54:55.535 182759 DEBUG nova.network.neutron [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Successfully created port: 798c95ea-7b9b-4c11-b29c-7a919d3070d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:54:56 np0005591285 nova_compute[182755]: 2026-01-21 23:54:56.426 182759 DEBUG nova.network.neutron [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Successfully updated port: 798c95ea-7b9b-4c11-b29c-7a919d3070d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:54:56 np0005591285 nova_compute[182755]: 2026-01-21 23:54:56.447 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-f9f52173-c7d5-4d5a-9edc-b3c4e283213d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:56 np0005591285 nova_compute[182755]: 2026-01-21 23:54:56.448 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-f9f52173-c7d5-4d5a-9edc-b3c4e283213d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:56 np0005591285 nova_compute[182755]: 2026-01-21 23:54:56.449 182759 DEBUG nova.network.neutron [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:54:56 np0005591285 nova_compute[182755]: 2026-01-21 23:54:56.678 182759 DEBUG nova.network.neutron [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.803 182759 DEBUG nova.network.neutron [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Updating instance_info_cache with network_info: [{"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
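The `network_info` blob cached above is plain JSON, which makes it easy to pull fields out of when reading logs like this. A small sketch that extracts the instance's fixed IP from a trimmed copy of the VIF entry (only the fields needed here are kept):

```python
import json

# VIF entry trimmed from the update_instance_cache_with_nw_info record
# above; everything except the subnet/IP structure has been dropped.
vif = json.loads("""
{"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4",
 "network": {"subnets": [{"cidr": "10.100.0.0/28",
                          "ips": [{"address": "10.100.0.7", "type": "fixed"}]}]}}
""")

# Collect every fixed IP across all subnets of this VIF.
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
print(fixed_ips)  # ['10.100.0.7']
```

Note `"active": false` in the full record: the port exists in Neutron but OVN has not yet reported it bound and up, which is expected at this point in the boot sequence.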
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.837 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-f9f52173-c7d5-4d5a-9edc-b3c4e283213d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.837 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Instance network_info: |[{"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.842 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Start _get_guest_xml network_info=[{"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.850 182759 WARNING nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.859 182759 DEBUG nova.virt.libvirt.host [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.861 182759 DEBUG nova.virt.libvirt.host [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.867 182759 DEBUG nova.virt.libvirt.host [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.868 182759 DEBUG nova.virt.libvirt.host [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.869 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.870 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.870 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.871 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.871 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.871 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.872 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.872 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.872 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.873 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.873 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.873 182759 DEBUG nova.virt.hardware [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.878 182759 DEBUG nova.virt.libvirt.vif [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1758440293',display_name='tempest-ServerDiskConfigTestJSON-server-1758440293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1758440293',id=55,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-tlh7z2ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:53Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=f9f52173-c7d5-4d5a-9edc-b3c4e283213d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.879 182759 DEBUG nova.network.os_vif_util [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.880 182759 DEBUG nova.network.os_vif_util [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.881 182759 DEBUG nova.objects.instance [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid f9f52173-c7d5-4d5a-9edc-b3c4e283213d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.911 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <uuid>f9f52173-c7d5-4d5a-9edc-b3c4e283213d</uuid>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <name>instance-00000037</name>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1758440293</nova:name>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:54:57</nova:creationTime>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        <nova:port uuid="798c95ea-7b9b-4c11-b29c-7a919d3070d4">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <entry name="serial">f9f52173-c7d5-4d5a-9edc-b3c4e283213d</entry>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <entry name="uuid">f9f52173-c7d5-4d5a-9edc-b3c4e283213d</entry>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.config"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:e2:1a:16"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <target dev="tap798c95ea-7b"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/console.log" append="off"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:54:57 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:54:57 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:54:57 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:54:57 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.913 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Preparing to wait for external event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.914 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.915 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.915 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.917 182759 DEBUG nova.virt.libvirt.vif [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1758440293',display_name='tempest-ServerDiskConfigTestJSON-server-1758440293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1758440293',id=55,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-tlh7z2ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:53Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=f9f52173-c7d5-4d5a-9edc-b3c4e283213d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.918 182759 DEBUG nova.network.os_vif_util [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.919 182759 DEBUG nova.network.os_vif_util [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.920 182759 DEBUG os_vif [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.921 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.922 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.923 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.928 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.928 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap798c95ea-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.929 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap798c95ea-7b, col_values=(('external_ids', {'iface-id': '798c95ea-7b9b-4c11-b29c-7a919d3070d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:1a:16', 'vm-uuid': 'f9f52173-c7d5-4d5a-9edc-b3c4e283213d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.933 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:57 np0005591285 NetworkManager[55017]: <info>  [1769039697.9342] manager: (tap798c95ea-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.937 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.943 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:57 np0005591285 nova_compute[182755]: 2026-01-21 23:54:57.946 182759 INFO os_vif [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b')#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.029 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.030 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.030 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:e2:1a:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.032 182759 INFO nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Using config drive#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.425 182759 INFO nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Creating config drive at /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.config#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.435 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqrbl41x4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.532 182759 DEBUG nova.compute.manager [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-changed-798c95ea-7b9b-4c11-b29c-7a919d3070d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.533 182759 DEBUG nova.compute.manager [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Refreshing instance network info cache due to event network-changed-798c95ea-7b9b-4c11-b29c-7a919d3070d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.534 182759 DEBUG oslo_concurrency.lockutils [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f9f52173-c7d5-4d5a-9edc-b3c4e283213d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.535 182759 DEBUG oslo_concurrency.lockutils [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f9f52173-c7d5-4d5a-9edc-b3c4e283213d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.535 182759 DEBUG nova.network.neutron [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Refreshing network info cache for port 798c95ea-7b9b-4c11-b29c-7a919d3070d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.581 182759 DEBUG oslo_concurrency.processutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqrbl41x4" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:54:58 np0005591285 kernel: tap798c95ea-7b: entered promiscuous mode
Jan 21 18:54:58 np0005591285 NetworkManager[55017]: <info>  [1769039698.6602] manager: (tap798c95ea-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.661 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:58Z|00125|binding|INFO|Claiming lport 798c95ea-7b9b-4c11-b29c-7a919d3070d4 for this chassis.
Jan 21 18:54:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:58Z|00126|binding|INFO|798c95ea-7b9b-4c11-b29c-7a919d3070d4: Claiming fa:16:3e:e2:1a:16 10.100.0.7
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.685 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:1a:16 10.100.0.7'], port_security=['fa:16:3e:e2:1a:16 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f9f52173-c7d5-4d5a-9edc-b3c4e283213d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=798c95ea-7b9b-4c11-b29c-7a919d3070d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.691 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:58Z|00127|binding|INFO|Setting lport 798c95ea-7b9b-4c11-b29c-7a919d3070d4 ovn-installed in OVS
Jan 21 18:54:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:58Z|00128|binding|INFO|Setting lport 798c95ea-7b9b-4c11-b29c-7a919d3070d4 up in Southbound
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.695 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.695 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 798c95ea-7b9b-4c11-b29c-7a919d3070d4 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis#033[00m
Jan 21 18:54:58 np0005591285 nova_compute[182755]: 2026-01-21 23:54:58.697 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.699 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff#033[00m
Jan 21 18:54:58 np0005591285 systemd-udevd[218090]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.717 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59c1580c-67d2-4233-83e3-f2f3a3ab1ea5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.719 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.725 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.726 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ebcd9f7-d56a-419e-a72d-07d02c40d049]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.727 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d878ea1-3c76-420a-aed6-23cbd0342071]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 NetworkManager[55017]: <info>  [1769039698.7309] device (tap798c95ea-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:54:58 np0005591285 NetworkManager[55017]: <info>  [1769039698.7320] device (tap798c95ea-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:54:58 np0005591285 systemd-machined[154022]: New machine qemu-22-instance-00000037.
Jan 21 18:54:58 np0005591285 systemd[1]: Started Virtual Machine qemu-22-instance-00000037.
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.749 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[0f832205-b2fd-44fd-b5de-1fae18c789ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.779 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d3eaa36b-469e-4369-9c07-323dd7a4db2c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.834 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d0905b6a-24fd-47bf-95ad-a5ad655a21a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 NetworkManager[55017]: <info>  [1769039698.8446] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.843 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba26ed76-3a30-43a9-84bd-6391078fe646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.906 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ded44e5b-9070-4b4f-8435-a466c15ccbfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.912 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[1281bd20-6f0f-4900-b07f-3324a2944e69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:58 np0005591285 NetworkManager[55017]: <info>  [1769039698.9580] device (tap7b586c54-30): carrier: link connected
Jan 21 18:54:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.968 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9f5285-7795-44aa-a148-f37afbf0918b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:58.998 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7bce1c-d7e7-44ac-aafd-f3cfbaf7c73f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425761, 'reachable_time': 28220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218125, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.026 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9b80d2-a070-4754-aaa6-6ca50df20980]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425761, 'tstamp': 425761}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218128, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.058 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3380b9dd-8216-40d0-8059-da4d0ae66689]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425761, 'reachable_time': 28220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218132, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.119 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eff93662-ac48-47ee-918e-ee4def99762f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.129 182759 DEBUG nova.compute.manager [req-ad799b17-518e-4fc4-bc35-1b4662179230 req-a48f6e77-7345-497b-9403-743ea6c5fe40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.130 182759 DEBUG oslo_concurrency.lockutils [req-ad799b17-518e-4fc4-bc35-1b4662179230 req-a48f6e77-7345-497b-9403-743ea6c5fe40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.130 182759 DEBUG oslo_concurrency.lockutils [req-ad799b17-518e-4fc4-bc35-1b4662179230 req-a48f6e77-7345-497b-9403-743ea6c5fe40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.131 182759 DEBUG oslo_concurrency.lockutils [req-ad799b17-518e-4fc4-bc35-1b4662179230 req-a48f6e77-7345-497b-9403-743ea6c5fe40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.131 182759 DEBUG nova.compute.manager [req-ad799b17-518e-4fc4-bc35-1b4662179230 req-a48f6e77-7345-497b-9403-743ea6c5fe40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Processing event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.137 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039699.1363528, f9f52173-c7d5-4d5a-9edc-b3c4e283213d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.137 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] VM Started (Lifecycle Event)#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.141 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.147 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.153 182759 INFO nova.virt.libvirt.driver [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Instance spawned successfully.#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.153 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.179 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.187 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.191 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.192 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.192 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.192 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.193 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.193 182759 DEBUG nova.virt.libvirt.driver [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.225 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.226 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039699.1374433, f9f52173-c7d5-4d5a-9edc-b3c4e283213d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.226 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:54:59 np0005591285 podman[218137]: 2026-01-21 23:54:59.23420985 +0000 UTC m=+0.085899944 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.236 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[30510c10-4581-4b24-812c-b74041edb39d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.238 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.238 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.239 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:59 np0005591285 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 18:54:59 np0005591285 NetworkManager[55017]: <info>  [1769039699.2417] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.244 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.247 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:54:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:54:59Z|00129|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.249 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.250 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.251 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.251 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d7a864-c906-4572-a87b-6a8ab0983bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.252 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:54:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:54:59.253 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.260 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.262 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039699.1471713, f9f52173-c7d5-4d5a-9edc-b3c4e283213d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.263 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.274 182759 INFO nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Took 5.26 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.275 182759 DEBUG nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.280 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.282 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.306 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.352 182759 INFO nova.compute.manager [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Took 5.93 seconds to build instance.#033[00m
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.368 182759 DEBUG oslo_concurrency.lockutils [None req-4c60d553-7ce7-4384-a085-991b0c0cfc7f a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:54:59 np0005591285 podman[218192]: 2026-01-21 23:54:59.752218762 +0000 UTC m=+0.097192568 container create 8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:54:59 np0005591285 podman[218192]: 2026-01-21 23:54:59.708943766 +0000 UTC m=+0.053917622 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:54:59 np0005591285 systemd[1]: Started libpod-conmon-8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2.scope.
Jan 21 18:54:59 np0005591285 nova_compute[182755]: 2026-01-21 23:54:59.837 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:54:59 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:54:59 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d29a2eb96bb87d14007663948f470f00c39541412e005947d97dd3dda5a1d070/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:54:59 np0005591285 podman[218192]: 2026-01-21 23:54:59.878617223 +0000 UTC m=+0.223591049 container init 8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:54:59 np0005591285 podman[218192]: 2026-01-21 23:54:59.890735909 +0000 UTC m=+0.235709695 container start 8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 18:54:59 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [NOTICE]   (218212) : New worker (218214) forked
Jan 21 18:54:59 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [NOTICE]   (218212) : Loading success.
Jan 21 18:55:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:00.876 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:55:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:00.877 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:55:00 np0005591285 nova_compute[182755]: 2026-01-21 23:55:00.900 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.127 182759 DEBUG nova.network.neutron [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Updated VIF entry in instance network info cache for port 798c95ea-7b9b-4c11-b29c-7a919d3070d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.128 182759 DEBUG nova.network.neutron [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Updating instance_info_cache with network_info: [{"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.159 182759 DEBUG oslo_concurrency.lockutils [req-3ec2f1e1-a6c6-4fa9-aaf5-f9e5cdfba054 req-7ae8106d-6fcb-4b75-a30c-a424a8a724f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f9f52173-c7d5-4d5a-9edc-b3c4e283213d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.258 182759 DEBUG nova.compute.manager [req-cc9115f9-31c1-4d38-87ab-f7b74abcbaf7 req-604ebdbd-db32-4590-89fa-d58a3d0ef411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.259 182759 DEBUG oslo_concurrency.lockutils [req-cc9115f9-31c1-4d38-87ab-f7b74abcbaf7 req-604ebdbd-db32-4590-89fa-d58a3d0ef411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.260 182759 DEBUG oslo_concurrency.lockutils [req-cc9115f9-31c1-4d38-87ab-f7b74abcbaf7 req-604ebdbd-db32-4590-89fa-d58a3d0ef411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.260 182759 DEBUG oslo_concurrency.lockutils [req-cc9115f9-31c1-4d38-87ab-f7b74abcbaf7 req-604ebdbd-db32-4590-89fa-d58a3d0ef411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.261 182759 DEBUG nova.compute.manager [req-cc9115f9-31c1-4d38-87ab-f7b74abcbaf7 req-604ebdbd-db32-4590-89fa-d58a3d0ef411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] No waiting events found dispatching network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:01 np0005591285 nova_compute[182755]: 2026-01-21 23:55:01.261 182759 WARNING nova.compute.manager [req-cc9115f9-31c1-4d38-87ab-f7b74abcbaf7 req-604ebdbd-db32-4590-89fa-d58a3d0ef411 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received unexpected event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:55:02 np0005591285 nova_compute[182755]: 2026-01-21 23:55:02.933 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:02.962 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:02.963 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:02.963 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:04 np0005591285 podman[218223]: 2026-01-21 23:55:04.266568272 +0000 UTC m=+0.135378334 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:55:04 np0005591285 nova_compute[182755]: 2026-01-21 23:55:04.845 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:06 np0005591285 podman[218250]: 2026-01-21 23:55:06.214195152 +0000 UTC m=+0.076475830 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:55:06 np0005591285 podman[218249]: 2026-01-21 23:55:06.2271415 +0000 UTC m=+0.088373429 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.480 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.480 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.481 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.481 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.481 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.495 182759 INFO nova.compute.manager [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Terminating instance#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.506 182759 DEBUG nova.compute.manager [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:55:07 np0005591285 kernel: tap798c95ea-7b (unregistering): left promiscuous mode
Jan 21 18:55:07 np0005591285 NetworkManager[55017]: <info>  [1769039707.5279] device (tap798c95ea-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.539 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:07Z|00130|binding|INFO|Releasing lport 798c95ea-7b9b-4c11-b29c-7a919d3070d4 from this chassis (sb_readonly=0)
Jan 21 18:55:07 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:07Z|00131|binding|INFO|Setting lport 798c95ea-7b9b-4c11-b29c-7a919d3070d4 down in Southbound
Jan 21 18:55:07 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:07Z|00132|binding|INFO|Removing iface tap798c95ea-7b ovn-installed in OVS
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.550 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:1a:16 10.100.0.7'], port_security=['fa:16:3e:e2:1a:16 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f9f52173-c7d5-4d5a-9edc-b3c4e283213d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=798c95ea-7b9b-4c11-b29c-7a919d3070d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.551 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 798c95ea-7b9b-4c11-b29c-7a919d3070d4 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.553 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.554 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f413377b-9213-41cc-9ad7-65b6abc2c0a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.555 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 21 18:55:07 np0005591285 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000037.scope: Consumed 8.863s CPU time.
Jan 21 18:55:07 np0005591285 systemd-machined[154022]: Machine qemu-22-instance-00000037 terminated.
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.732 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.739 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [NOTICE]   (218212) : haproxy version is 2.8.14-c23fe91
Jan 21 18:55:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [NOTICE]   (218212) : path to executable is /usr/sbin/haproxy
Jan 21 18:55:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [WARNING]  (218212) : Exiting Master process...
Jan 21 18:55:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [WARNING]  (218212) : Exiting Master process...
Jan 21 18:55:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [ALERT]    (218212) : Current worker (218214) exited with code 143 (Terminated)
Jan 21 18:55:07 np0005591285 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218208]: [WARNING]  (218212) : All workers exited. Exiting... (0)
Jan 21 18:55:07 np0005591285 systemd[1]: libpod-8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2.scope: Deactivated successfully.
Jan 21 18:55:07 np0005591285 podman[218314]: 2026-01-21 23:55:07.764312592 +0000 UTC m=+0.069375348 container died 8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.791 182759 INFO nova.virt.libvirt.driver [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Instance destroyed successfully.#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.792 182759 DEBUG nova.objects.instance [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid f9f52173-c7d5-4d5a-9edc-b3c4e283213d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:07 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2-userdata-shm.mount: Deactivated successfully.
Jan 21 18:55:07 np0005591285 systemd[1]: var-lib-containers-storage-overlay-d29a2eb96bb87d14007663948f470f00c39541412e005947d97dd3dda5a1d070-merged.mount: Deactivated successfully.
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.819 182759 DEBUG nova.virt.libvirt.vif [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:54:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1758440293',display_name='tempest-ServerDiskConfigTestJSON-server-1758440293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1758440293',id=55,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-tlh7z2ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:05Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=f9f52173-c7d5-4d5a-9edc-b3c4e283213d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.820 182759 DEBUG nova.network.os_vif_util [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "address": "fa:16:3e:e2:1a:16", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap798c95ea-7b", "ovs_interfaceid": "798c95ea-7b9b-4c11-b29c-7a919d3070d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.822 182759 DEBUG nova.network.os_vif_util [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.823 182759 DEBUG os_vif [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:55:07 np0005591285 podman[218314]: 2026-01-21 23:55:07.823632188 +0000 UTC m=+0.128694934 container cleanup 8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.825 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.826 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap798c95ea-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.832 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.836 182759 INFO os_vif [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:1a:16,bridge_name='br-int',has_traffic_filtering=True,id=798c95ea-7b9b-4c11-b29c-7a919d3070d4,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap798c95ea-7b')#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.837 182759 INFO nova.virt.libvirt.driver [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Deleting instance files /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d_del#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.839 182759 INFO nova.virt.libvirt.driver [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Deletion of /var/lib/nova/instances/f9f52173-c7d5-4d5a-9edc-b3c4e283213d_del complete#033[00m
Jan 21 18:55:07 np0005591285 systemd[1]: libpod-conmon-8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2.scope: Deactivated successfully.
Jan 21 18:55:07 np0005591285 podman[218358]: 2026-01-21 23:55:07.927185505 +0000 UTC m=+0.064868656 container remove 8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.936 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[239264aa-39fa-4d28-bdca-7381b3599605]: (4, ('Wed Jan 21 11:55:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2)\n8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2\nWed Jan 21 11:55:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2)\n8d762925d4000ff506032e7a81b8466c51d111d9f0aad14f6ab8bc521d54c7f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.938 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[09259545-1e4f-47b3-981d-96ef7b7758a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.939 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.941 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.949 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[85bbb1df-7749-4b8a-ae09-5f87eb327aeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 nova_compute[182755]: 2026-01-21 23:55:07.953 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.964 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1192d1-d8ac-4dfb-8096-ce222762d4a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.965 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc322fe-f6fc-4d21-8bb9-62f4f02c02f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.987 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9c8bc99f-a6bd-4913-8b42-54ab78d2d0c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425748, 'reachable_time': 20136, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218373, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.991 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:55:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:07.991 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9049ea-f869-4b41-ba73-ba61dc426e02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:07 np0005591285 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.000 182759 DEBUG nova.compute.manager [req-854a20fd-99dc-406b-b6bb-c40a3f3c84ae req-53b9b7d3-3d79-4398-bda7-b0c84a46037d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-vif-unplugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.000 182759 DEBUG oslo_concurrency.lockutils [req-854a20fd-99dc-406b-b6bb-c40a3f3c84ae req-53b9b7d3-3d79-4398-bda7-b0c84a46037d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.001 182759 DEBUG oslo_concurrency.lockutils [req-854a20fd-99dc-406b-b6bb-c40a3f3c84ae req-53b9b7d3-3d79-4398-bda7-b0c84a46037d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.001 182759 DEBUG oslo_concurrency.lockutils [req-854a20fd-99dc-406b-b6bb-c40a3f3c84ae req-53b9b7d3-3d79-4398-bda7-b0c84a46037d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.001 182759 DEBUG nova.compute.manager [req-854a20fd-99dc-406b-b6bb-c40a3f3c84ae req-53b9b7d3-3d79-4398-bda7-b0c84a46037d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] No waiting events found dispatching network-vif-unplugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.001 182759 DEBUG nova.compute.manager [req-854a20fd-99dc-406b-b6bb-c40a3f3c84ae req-53b9b7d3-3d79-4398-bda7-b0c84a46037d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-vif-unplugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.070 182759 INFO nova.compute.manager [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Took 0.56 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.071 182759 DEBUG oslo.service.loopingcall [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.071 182759 DEBUG nova.compute.manager [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:55:08 np0005591285 nova_compute[182755]: 2026-01-21 23:55:08.071 182759 DEBUG nova.network.neutron [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.511 182759 DEBUG nova.network.neutron [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.542 182759 INFO nova.compute.manager [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Took 1.47 seconds to deallocate network for instance.#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.652 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.653 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.668 182759 DEBUG nova.compute.manager [req-d84dda75-5906-4b37-93b4-ad30475fef68 req-33bba927-2ce0-42e9-aff4-46a9438a5efa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-vif-deleted-798c95ea-7b9b-4c11-b29c-7a919d3070d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.713 182759 DEBUG nova.compute.provider_tree [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.730 182759 DEBUG nova.scheduler.client.report [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.760 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.808 182759 INFO nova.scheduler.client.report [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocations for instance f9f52173-c7d5-4d5a-9edc-b3c4e283213d
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.846 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:09.879 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:55:09 np0005591285 nova_compute[182755]: 2026-01-21 23:55:09.901 182759 DEBUG oslo_concurrency.lockutils [None req-a95e50ff-26e7-4a09-ae00-9eddfdc5ee55 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:10 np0005591285 nova_compute[182755]: 2026-01-21 23:55:10.097 182759 DEBUG nova.compute.manager [req-a1fea8eb-9e10-44fb-9fff-2aaf0a0a4efd req-6b1fbc16-d14a-482b-832e-da1aaeaa6f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:55:10 np0005591285 nova_compute[182755]: 2026-01-21 23:55:10.097 182759 DEBUG oslo_concurrency.lockutils [req-a1fea8eb-9e10-44fb-9fff-2aaf0a0a4efd req-6b1fbc16-d14a-482b-832e-da1aaeaa6f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:10 np0005591285 nova_compute[182755]: 2026-01-21 23:55:10.098 182759 DEBUG oslo_concurrency.lockutils [req-a1fea8eb-9e10-44fb-9fff-2aaf0a0a4efd req-6b1fbc16-d14a-482b-832e-da1aaeaa6f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:10 np0005591285 nova_compute[182755]: 2026-01-21 23:55:10.098 182759 DEBUG oslo_concurrency.lockutils [req-a1fea8eb-9e10-44fb-9fff-2aaf0a0a4efd req-6b1fbc16-d14a-482b-832e-da1aaeaa6f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f9f52173-c7d5-4d5a-9edc-b3c4e283213d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:10 np0005591285 nova_compute[182755]: 2026-01-21 23:55:10.099 182759 DEBUG nova.compute.manager [req-a1fea8eb-9e10-44fb-9fff-2aaf0a0a4efd req-6b1fbc16-d14a-482b-832e-da1aaeaa6f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] No waiting events found dispatching network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:55:10 np0005591285 nova_compute[182755]: 2026-01-21 23:55:10.099 182759 WARNING nova.compute.manager [req-a1fea8eb-9e10-44fb-9fff-2aaf0a0a4efd req-6b1fbc16-d14a-482b-832e-da1aaeaa6f64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Received unexpected event network-vif-plugged-798c95ea-7b9b-4c11-b29c-7a919d3070d4 for instance with vm_state deleted and task_state None.
Jan 21 18:55:12 np0005591285 nova_compute[182755]: 2026-01-21 23:55:12.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:14 np0005591285 nova_compute[182755]: 2026-01-21 23:55:14.849 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:17 np0005591285 nova_compute[182755]: 2026-01-21 23:55:17.824 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:17 np0005591285 nova_compute[182755]: 2026-01-21 23:55:17.825 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:17 np0005591285 nova_compute[182755]: 2026-01-21 23:55:17.831 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:17 np0005591285 nova_compute[182755]: 2026-01-21 23:55:17.851 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.015 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.015 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.025 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.025 182759 INFO nova.compute.claims [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Claim successful on node compute-2.ctlplane.example.com
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.225 182759 DEBUG nova.compute.provider_tree [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.244 182759 DEBUG nova.scheduler.client.report [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.267 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.268 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.320 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.321 182759 DEBUG nova.network.neutron [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.339 182759 INFO nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.365 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.520 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.522 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.523 182759 INFO nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Creating image(s)
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.525 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.525 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.526 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.554 182759 DEBUG nova.policy [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb46f340c44c473b9286568553cb6374', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.559 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.656 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.657 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.658 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.674 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.749 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.751 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.795 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.796 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.797 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.868 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.870 182759 DEBUG nova.virt.disk.api [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Checking if we can resize image /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.871 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.972 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.974 182759 DEBUG nova.virt.disk.api [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Cannot resize image /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:55:18 np0005591285 nova_compute[182755]: 2026-01-21 23:55:18.975 182759 DEBUG nova.objects.instance [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.001 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.002 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Ensure instance console log exists: /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.003 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.004 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.004 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.229 182759 DEBUG nova.network.neutron [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Successfully created port: b07d054c-f7c2-465a-8c0d-fa4f5dac4828 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.760 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:19 np0005591285 nova_compute[182755]: 2026-01-21 23:55:19.851 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.313 182759 DEBUG nova.network.neutron [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Successfully updated port: b07d054c-f7c2-465a-8c0d-fa4f5dac4828 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.341 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.342 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquired lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.342 182759 DEBUG nova.network.neutron [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.403 182759 DEBUG nova.compute.manager [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-changed-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.403 182759 DEBUG nova.compute.manager [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Refreshing instance network info cache due to event network-changed-b07d054c-f7c2-465a-8c0d-fa4f5dac4828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.404 182759 DEBUG oslo_concurrency.lockutils [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:20 np0005591285 nova_compute[182755]: 2026-01-21 23:55:20.539 182759 DEBUG nova.network.neutron [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.159 182759 DEBUG nova.network.neutron [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.185 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Releasing lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.186 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance network_info: |[{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.187 182759 DEBUG oslo_concurrency.lockutils [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.187 182759 DEBUG nova.network.neutron [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Refreshing network info cache for port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.192 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Start _get_guest_xml network_info=[{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.200 182759 WARNING nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.208 182759 DEBUG nova.virt.libvirt.host [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.209 182759 DEBUG nova.virt.libvirt.host [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:55:22 np0005591285 podman[218389]: 2026-01-21 23:55:22.218140227 +0000 UTC m=+0.084488355 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, distribution-scope=public, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.221 182759 DEBUG nova.virt.libvirt.host [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.222 182759 DEBUG nova.virt.libvirt.host [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.225 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.225 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.226 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.226 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.227 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.227 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.227 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.228 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.228 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.229 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.229 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.229 182759 DEBUG nova.virt.hardware [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.238 182759 DEBUG nova.virt.libvirt.vif [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-89867646',display_name='tempest-SecurityGroupsTestJSON-server-89867646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-89867646',id=57,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-vin64boz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJSON
-744520065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:18Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=2b44528a-0ec9-4df9-afce-0d76ed92b221,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.239 182759 DEBUG nova.network.os_vif_util [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.240 182759 DEBUG nova.network.os_vif_util [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.242 182759 DEBUG nova.objects.instance [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:22 np0005591285 podman[218390]: 2026-01-21 23:55:22.246968354 +0000 UTC m=+0.104630848 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.269 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <uuid>2b44528a-0ec9-4df9-afce-0d76ed92b221</uuid>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <name>instance-00000039</name>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:name>tempest-SecurityGroupsTestJSON-server-89867646</nova:name>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:55:22</nova:creationTime>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:user uuid="fb46f340c44c473b9286568553cb6374">tempest-SecurityGroupsTestJSON-744520065-project-member</nova:user>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:project uuid="8556453a9e6644b4b29f7e2585b6beb3">tempest-SecurityGroupsTestJSON-744520065</nova:project>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        <nova:port uuid="b07d054c-f7c2-465a-8c0d-fa4f5dac4828">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <entry name="serial">2b44528a-0ec9-4df9-afce-0d76ed92b221</entry>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <entry name="uuid">2b44528a-0ec9-4df9-afce-0d76ed92b221</entry>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:87:3c:66"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <target dev="tapb07d054c-f7"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/console.log" append="off"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:55:22 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:55:22 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:55:22 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:55:22 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.270 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Preparing to wait for external event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.270 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.270 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.270 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.271 182759 DEBUG nova.virt.libvirt.vif [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-89867646',display_name='tempest-SecurityGroupsTestJSON-server-89867646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-89867646',id=57,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-vin64boz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGrou
psTestJSON-744520065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:18Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=2b44528a-0ec9-4df9-afce-0d76ed92b221,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.272 182759 DEBUG nova.network.os_vif_util [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.272 182759 DEBUG nova.network.os_vif_util [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.273 182759 DEBUG os_vif [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.274 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.274 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.274 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.277 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.277 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb07d054c-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.278 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb07d054c-f7, col_values=(('external_ids', {'iface-id': 'b07d054c-f7c2-465a-8c0d-fa4f5dac4828', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:3c:66', 'vm-uuid': '2b44528a-0ec9-4df9-afce-0d76ed92b221'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.281 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:22 np0005591285 NetworkManager[55017]: <info>  [1769039722.2815] manager: (tapb07d054c-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.284 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.289 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.291 182759 INFO os_vif [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7')#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.361 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.361 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.362 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] No VIF found with MAC fa:16:3e:87:3c:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.362 182759 INFO nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Using config drive#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.793 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039707.7870975, f9f52173-c7d5-4d5a-9edc-b3c4e283213d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.793 182759 INFO nova.compute.manager [-] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.816 182759 DEBUG nova.compute.manager [None req-2eca4140-f400-42ed-8138-708742c57784 - - - - - -] [instance: f9f52173-c7d5-4d5a-9edc-b3c4e283213d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.874 182759 INFO nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Creating config drive at /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config#033[00m
Jan 21 18:55:22 np0005591285 nova_compute[182755]: 2026-01-21 23:55:22.880 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8lw2x2ds execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.016 182759 DEBUG oslo_concurrency.processutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8lw2x2ds" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:23 np0005591285 kernel: tapb07d054c-f7: entered promiscuous mode
Jan 21 18:55:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:23Z|00133|binding|INFO|Claiming lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for this chassis.
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:23 np0005591285 NetworkManager[55017]: <info>  [1769039723.1173] manager: (tapb07d054c-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 21 18:55:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:23Z|00134|binding|INFO|b07d054c-f7c2-465a-8c0d-fa4f5dac4828: Claiming fa:16:3e:87:3c:66 10.100.0.5
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.130 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.141 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:3c:66 10.100.0.5'], port_security=['fa:16:3e:87:3c:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2b44528a-0ec9-4df9-afce-0d76ed92b221', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07a1a0ce-5790-4d2e-8869-adf91647e1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c83a09c2-c943-4d92-aedc-1a1adb93cc19, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b07d054c-f7c2-465a-8c0d-fa4f5dac4828) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.143 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 in datapath f32b0ae0-64b5-4b08-b029-da33b7e8f96a bound to our chassis#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.144 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f32b0ae0-64b5-4b08-b029-da33b7e8f96a#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.163 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f03a39d-709f-4743-b950-92a78c10cff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.164 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf32b0ae0-61 in ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.167 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf32b0ae0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.167 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1867b868-0eb1-4fa7-b6e1-4e8bd4a48174]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.168 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[23507c7e-b48e-4da4-9fd6-60d3bf42d025]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 systemd-udevd[218452]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:55:23 np0005591285 systemd-machined[154022]: New machine qemu-23-instance-00000039.
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.181 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[edb47927-d5c1-4911-94fc-c71708e67a4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 NetworkManager[55017]: <info>  [1769039723.1967] device (tapb07d054c-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:55:23 np0005591285 NetworkManager[55017]: <info>  [1769039723.1985] device (tapb07d054c-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.212 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:23Z|00135|binding|INFO|Setting lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 ovn-installed in OVS
Jan 21 18:55:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:23Z|00136|binding|INFO|Setting lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 up in Southbound
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.217 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:23 np0005591285 systemd[1]: Started Virtual Machine qemu-23-instance-00000039.
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.224 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8be3665f-244a-4bf1-9104-e1aefcddd466]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.276 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[25277f9d-2137-42ba-962f-6e609c678dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 NetworkManager[55017]: <info>  [1769039723.2886] manager: (tapf32b0ae0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.287 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc68d3e-b558-4d14-866e-0da439f11e21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.325 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[76161f56-3ce2-4957-a1d7-c09400eb02b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.329 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[5e70aef9-d2fe-4e1f-b135-00d6fa2badc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 NetworkManager[55017]: <info>  [1769039723.3683] device (tapf32b0ae0-60): carrier: link connected
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.376 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[944cccae-569a-46d2-bc56-4e24d63a1416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.400 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4365abc3-dac3-4b70-816e-e1c45a42c336]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf32b0ae0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:84:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428202, 'reachable_time': 18816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218485, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.421 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[30d8afcd-fd98-4371-b6e4-03078b652c0f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:84cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428202, 'tstamp': 428202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218486, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.443 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e4982f5b-54ac-4f57-bcc3-eefd5849caa0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf32b0ae0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:84:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428202, 'reachable_time': 18816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218487, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.491 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[43dcabdb-7fdd-4183-91a4-93ba94db3e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.570 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9fcc8027-8d9c-4349-b462-207180d8a8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.572 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32b0ae0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.572 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.573 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf32b0ae0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.575 182759 DEBUG nova.compute.manager [req-152cbf9a-21ae-4add-8076-559c400a94dd req-3983f57c-e397-46c4-abd1-c02f871a1c76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.576 182759 DEBUG oslo_concurrency.lockutils [req-152cbf9a-21ae-4add-8076-559c400a94dd req-3983f57c-e397-46c4-abd1-c02f871a1c76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.576 182759 DEBUG oslo_concurrency.lockutils [req-152cbf9a-21ae-4add-8076-559c400a94dd req-3983f57c-e397-46c4-abd1-c02f871a1c76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.576 182759 DEBUG oslo_concurrency.lockutils [req-152cbf9a-21ae-4add-8076-559c400a94dd req-3983f57c-e397-46c4-abd1-c02f871a1c76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.576 182759 DEBUG nova.compute.manager [req-152cbf9a-21ae-4add-8076-559c400a94dd req-3983f57c-e397-46c4-abd1-c02f871a1c76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Processing event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:55:23 np0005591285 kernel: tapf32b0ae0-60: entered promiscuous mode
Jan 21 18:55:23 np0005591285 NetworkManager[55017]: <info>  [1769039723.6098] manager: (tapf32b0ae0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.613 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf32b0ae0-60, col_values=(('external_ids', {'iface-id': '6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:23 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:23Z|00137|binding|INFO|Releasing lport 6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47 from this chassis (sb_readonly=0)
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.615 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.616 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0e83c940-92d1-480c-b47c-9c35cc68c61c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.617 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.617 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:55:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:23.617 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'env', 'PROCESS_TAG=haproxy-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:55:23 np0005591285 nova_compute[182755]: 2026-01-21 23:55:23.625 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:24 np0005591285 podman[218519]: 2026-01-21 23:55:24.062894708 +0000 UTC m=+0.077679292 container create 60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:55:24 np0005591285 systemd[1]: Started libpod-conmon-60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce.scope.
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.106 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.107 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039724.105472, 2b44528a-0ec9-4df9-afce-0d76ed92b221 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.107 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] VM Started (Lifecycle Event)#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.117 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:55:24 np0005591285 podman[218519]: 2026-01-21 23:55:24.029415697 +0000 UTC m=+0.044200311 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.122 182759 INFO nova.virt.libvirt.driver [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance spawned successfully.#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.122 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:55:24 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.140 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:24 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6b8d563281285d67beb5b982461f1dc3c4576c4c7fbf9364a37437da39b6aae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.148 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.153 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.153 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.154 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.154 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.155 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.155 182759 DEBUG nova.virt.libvirt.driver [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:24 np0005591285 podman[218519]: 2026-01-21 23:55:24.164822191 +0000 UTC m=+0.179606795 container init 60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:55:24 np0005591285 podman[218519]: 2026-01-21 23:55:24.170460562 +0000 UTC m=+0.185245146 container start 60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.192 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.193 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039724.1058018, 2b44528a-0ec9-4df9-afce-0d76ed92b221 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.193 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:55:24 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [NOTICE]   (218545) : New worker (218547) forked
Jan 21 18:55:24 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [NOTICE]   (218545) : Loading success.
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.235 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.240 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039724.1104238, 2b44528a-0ec9-4df9-afce-0d76ed92b221 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.240 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.277 182759 INFO nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Took 5.76 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.277 182759 DEBUG nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.278 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.287 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.343 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.383 182759 INFO nova.compute.manager [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Took 6.43 seconds to build instance.#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.405 182759 DEBUG oslo_concurrency.lockutils [None req-902440b9-b2f7-4012-a2a7-6955d51d74ce fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.515 182759 DEBUG nova.network.neutron [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updated VIF entry in instance network info cache for port b07d054c-f7c2-465a-8c0d-fa4f5dac4828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.515 182759 DEBUG nova.network.neutron [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.538 182759 DEBUG oslo_concurrency.lockutils [req-9817f9da-3375-439d-82e3-f57774cae4d8 req-9b55e334-50ff-4f5c-a89f-253e4ec8b802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:24 np0005591285 nova_compute[182755]: 2026-01-21 23:55:24.854 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.863 182759 DEBUG nova.compute.manager [req-e9b55737-0039-4aeb-a012-998052b946c8 req-f6e8c465-1f55-4562-b4d6-5ba2f013ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.864 182759 DEBUG oslo_concurrency.lockutils [req-e9b55737-0039-4aeb-a012-998052b946c8 req-f6e8c465-1f55-4562-b4d6-5ba2f013ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.864 182759 DEBUG oslo_concurrency.lockutils [req-e9b55737-0039-4aeb-a012-998052b946c8 req-f6e8c465-1f55-4562-b4d6-5ba2f013ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.865 182759 DEBUG oslo_concurrency.lockutils [req-e9b55737-0039-4aeb-a012-998052b946c8 req-f6e8c465-1f55-4562-b4d6-5ba2f013ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.865 182759 DEBUG nova.compute.manager [req-e9b55737-0039-4aeb-a012-998052b946c8 req-f6e8c465-1f55-4562-b4d6-5ba2f013ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:55:25 np0005591285 nova_compute[182755]: 2026-01-21 23:55:25.865 182759 WARNING nova.compute.manager [req-e9b55737-0039-4aeb-a012-998052b946c8 req-f6e8c465-1f55-4562-b4d6-5ba2f013ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received unexpected event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with vm_state active and task_state None.
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.243 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.244 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.245 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.270 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.271 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.272 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.272 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.350 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.426 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.427 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.507 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.747 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.749 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5544MB free_disk=73.37179565429688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.750 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.751 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.852 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 2b44528a-0ec9-4df9-afce-0d76ed92b221 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.853 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.853 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.916 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.934 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.959 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 18:55:27 np0005591285 nova_compute[182755]: 2026-01-21 23:55:27.959 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.114 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.115 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.116 182759 INFO nova.compute.manager [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Rebooting instance
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.135 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.136 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquired lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.137 182759 DEBUG nova.network.neutron [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.286 182759 DEBUG nova.compute.manager [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-changed-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.287 182759 DEBUG nova.compute.manager [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Refreshing instance network info cache due to event network-changed-b07d054c-f7c2-465a-8c0d-fa4f5dac4828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.287 182759 DEBUG oslo_concurrency.lockutils [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:28 np0005591285 nova_compute[182755]: 2026-01-21 23:55:28.934 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:55:29 np0005591285 nova_compute[182755]: 2026-01-21 23:55:29.893 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 18:55:30 np0005591285 podman[218563]: 2026-01-21 23:55:30.231118466 +0000 UTC m=+0.093100066 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.458 182759 DEBUG nova.network.neutron [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.490 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Releasing lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.493 182759 DEBUG oslo_concurrency.lockutils [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.494 182759 DEBUG nova.network.neutron [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Refreshing network info cache for port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.507 182759 DEBUG nova.compute.manager [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:55:30 np0005591285 kernel: tapb07d054c-f7 (unregistering): left promiscuous mode
Jan 21 18:55:30 np0005591285 NetworkManager[55017]: <info>  [1769039730.7191] device (tapb07d054c-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.726 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:30Z|00138|binding|INFO|Releasing lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 from this chassis (sb_readonly=0)
Jan 21 18:55:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:30Z|00139|binding|INFO|Setting lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 down in Southbound
Jan 21 18:55:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:30Z|00140|binding|INFO|Removing iface tapb07d054c-f7 ovn-installed in OVS
Jan 21 18:55:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:30.737 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:3c:66 10.100.0.5'], port_security=['fa:16:3e:87:3c:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2b44528a-0ec9-4df9-afce-0d76ed92b221', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '07a1a0ce-5790-4d2e-8869-adf91647e1de 746fe9a7-60c8-4d8b-9d10-bd8b258787a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c83a09c2-c943-4d92-aedc-1a1adb93cc19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b07d054c-f7c2-465a-8c0d-fa4f5dac4828) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:55:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:30.740 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 in datapath f32b0ae0-64b5-4b08-b029-da33b7e8f96a unbound from our chassis
Jan 21 18:55:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:30.742 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:55:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:30.744 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7b4579-660d-4ea2-917f-b05e4da44e16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:55:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:30.744 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a namespace which is not needed anymore
Jan 21 18:55:30 np0005591285 nova_compute[182755]: 2026-01-21 23:55:30.765 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:30 np0005591285 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 21 18:55:30 np0005591285 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000039.scope: Consumed 7.553s CPU time.
Jan 21 18:55:30 np0005591285 systemd-machined[154022]: Machine qemu-23-instance-00000039 terminated.
Jan 21 18:55:30 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [NOTICE]   (218545) : haproxy version is 2.8.14-c23fe91
Jan 21 18:55:30 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [NOTICE]   (218545) : path to executable is /usr/sbin/haproxy
Jan 21 18:55:30 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [WARNING]  (218545) : Exiting Master process...
Jan 21 18:55:30 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [ALERT]    (218545) : Current worker (218547) exited with code 143 (Terminated)
Jan 21 18:55:30 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218541]: [WARNING]  (218545) : All workers exited. Exiting... (0)
Jan 21 18:55:30 np0005591285 systemd[1]: libpod-60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce.scope: Deactivated successfully.
Jan 21 18:55:30 np0005591285 podman[218610]: 2026-01-21 23:55:30.924240553 +0000 UTC m=+0.057405887 container died 60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:55:30 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce-userdata-shm.mount: Deactivated successfully.
Jan 21 18:55:31 np0005591285 systemd[1]: var-lib-containers-storage-overlay-a6b8d563281285d67beb5b982461f1dc3c4576c4c7fbf9364a37437da39b6aae-merged.mount: Deactivated successfully.
Jan 21 18:55:31 np0005591285 podman[218610]: 2026-01-21 23:55:31.067955172 +0000 UTC m=+0.201120496 container cleanup 60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.096 182759 INFO nova.virt.libvirt.driver [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance destroyed successfully.
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.097 182759 DEBUG nova.objects.instance [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'resources' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:55:31 np0005591285 systemd[1]: libpod-conmon-60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce.scope: Deactivated successfully.
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.120 182759 DEBUG nova.virt.libvirt.vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-89867646',display_name='tempest-SecurityGroupsTestJSON-server-89867646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-89867646',id=57,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-vin64boz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJSON-744520065-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:30Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=2b44528a-0ec9-4df9-afce-0d76ed92b221,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.121 182759 DEBUG nova.network.os_vif_util [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.122 182759 DEBUG nova.network.os_vif_util [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.123 182759 DEBUG os_vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.124 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.125 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb07d054c-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.129 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.134 182759 INFO os_vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7')#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.142 182759 DEBUG nova.virt.libvirt.driver [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Start _get_guest_xml network_info=[{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:55:31 np0005591285 podman[218653]: 2026-01-21 23:55:31.14571387 +0000 UTC m=+0.043612363 container remove 60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.146 182759 WARNING nova.virt.libvirt.driver [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.152 182759 DEBUG nova.virt.libvirt.host [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.152 182759 DEBUG nova.virt.libvirt.host [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.153 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[23766b19-ddb1-4382-b1a9-f8a00b833ec3]: (4, ('Wed Jan 21 11:55:30 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a (60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce)\n60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce\nWed Jan 21 11:55:31 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a (60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce)\n60785762b73496354a0753eb65b5ecb0bd5a37d4f28611e9059e72f1320f71ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.155 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b83a594f-cdf0-4f59-b670-61fa0c17c9fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.156 182759 DEBUG nova.virt.libvirt.host [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.156 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32b0ae0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.157 182759 DEBUG nova.virt.libvirt.host [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:55:31 np0005591285 kernel: tapf32b0ae0-60: left promiscuous mode
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.158 182759 DEBUG nova.virt.libvirt.driver [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.160 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.161 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.161 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.162 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.162 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.162 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.162 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.162 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.163 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.163 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.163 182759 DEBUG nova.virt.hardware [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.163 182759 DEBUG nova.objects.instance [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.165 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.170 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.171 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.174 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e04cdca0-101f-47ab-a2da-334fb88c3b3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.181 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.190 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[899e9086-b8f6-4179-9bc1-8f997e862904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.192 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d2a66f-738c-43b8-87ac-7f50c534a734]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.217 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[55acb4f6-3e82-46a2-a348-4346f317464d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428192, 'reachable_time': 23480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218675, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.220 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.220 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[9265d9b8-e298-48f2-9242-3e45d6628d74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 systemd[1]: run-netns-ovnmeta\x2df32b0ae0\x2d64b5\x2d4b08\x2db029\x2dda33b7e8f96a.mount: Deactivated successfully.
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.239 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.240 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.240 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.241 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.242 182759 DEBUG nova.virt.libvirt.vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-89867646',display_name='tempest-SecurityGroupsTestJSON-server-89867646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-89867646',id=57,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-vin64boz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJSON-744520065-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:30Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=2b44528a-0ec9-4df9-afce-0d76ed92b221,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.243 182759 DEBUG nova.network.os_vif_util [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.244 182759 DEBUG nova.network.os_vif_util [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.245 182759 DEBUG nova.objects.instance [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.263 182759 DEBUG nova.virt.libvirt.driver [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <uuid>2b44528a-0ec9-4df9-afce-0d76ed92b221</uuid>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <name>instance-00000039</name>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:name>tempest-SecurityGroupsTestJSON-server-89867646</nova:name>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:55:31</nova:creationTime>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:user uuid="fb46f340c44c473b9286568553cb6374">tempest-SecurityGroupsTestJSON-744520065-project-member</nova:user>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:project uuid="8556453a9e6644b4b29f7e2585b6beb3">tempest-SecurityGroupsTestJSON-744520065</nova:project>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        <nova:port uuid="b07d054c-f7c2-465a-8c0d-fa4f5dac4828">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <entry name="serial">2b44528a-0ec9-4df9-afce-0d76ed92b221</entry>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <entry name="uuid">2b44528a-0ec9-4df9-afce-0d76ed92b221</entry>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk.config"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:87:3c:66"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <target dev="tapb07d054c-f7"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/console.log" append="off"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:55:31 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:55:31 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:55:31 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:55:31 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.265 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.328 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.331 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.433 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.435 182759 DEBUG nova.objects.instance [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.455 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.530 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.532 182759 DEBUG nova.virt.disk.api [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Checking if we can resize image /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.532 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.631 182759 DEBUG oslo_concurrency.processutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.633 182759 DEBUG nova.virt.disk.api [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Cannot resize image /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.634 182759 DEBUG nova.objects.instance [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.655 182759 DEBUG nova.virt.libvirt.vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-89867646',display_name='tempest-SecurityGroupsTestJSON-server-89867646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-89867646',id=57,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-vin64boz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJSON-744520065-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:30Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=2b44528a-0ec9-4df9-afce-0d76ed92b221,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.656 182759 DEBUG nova.network.os_vif_util [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.657 182759 DEBUG nova.network.os_vif_util [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.658 182759 DEBUG os_vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.659 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.660 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.661 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.670 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.671 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb07d054c-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.672 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb07d054c-f7, col_values=(('external_ids', {'iface-id': 'b07d054c-f7c2-465a-8c0d-fa4f5dac4828', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:3c:66', 'vm-uuid': '2b44528a-0ec9-4df9-afce-0d76ed92b221'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:31 np0005591285 NetworkManager[55017]: <info>  [1769039731.6763] manager: (tapb07d054c-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.679 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.681 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.682 182759 INFO os_vif [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7')#033[00m
Jan 21 18:55:31 np0005591285 kernel: tapb07d054c-f7: entered promiscuous mode
Jan 21 18:55:31 np0005591285 NetworkManager[55017]: <info>  [1769039731.8044] manager: (tapb07d054c-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Jan 21 18:55:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:31Z|00141|binding|INFO|Claiming lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for this chassis.
Jan 21 18:55:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:31Z|00142|binding|INFO|b07d054c-f7c2-465a-8c0d-fa4f5dac4828: Claiming fa:16:3e:87:3c:66 10.100.0.5
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.806 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 systemd-udevd[218592]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.813 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:3c:66 10.100.0.5'], port_security=['fa:16:3e:87:3c:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2b44528a-0ec9-4df9-afce-0d76ed92b221', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '07a1a0ce-5790-4d2e-8869-adf91647e1de 746fe9a7-60c8-4d8b-9d10-bd8b258787a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c83a09c2-c943-4d92-aedc-1a1adb93cc19, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b07d054c-f7c2-465a-8c0d-fa4f5dac4828) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.814 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 in datapath f32b0ae0-64b5-4b08-b029-da33b7e8f96a bound to our chassis#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.815 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f32b0ae0-64b5-4b08-b029-da33b7e8f96a#033[00m
Jan 21 18:55:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:31Z|00143|binding|INFO|Setting lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 ovn-installed in OVS
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:31Z|00144|binding|INFO|Setting lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 up in Southbound
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.819 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.822 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 NetworkManager[55017]: <info>  [1769039731.8301] device (tapb07d054c-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:55:31 np0005591285 nova_compute[182755]: 2026-01-21 23:55:31.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:31 np0005591285 NetworkManager[55017]: <info>  [1769039731.8313] device (tapb07d054c-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.839 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[76256938-d11b-49c8-96e2-5daa69550344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.841 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf32b0ae0-61 in ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.843 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf32b0ae0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.844 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b01574dd-ed8d-4661-bf92-a632a178f229]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.845 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2ebc598c-42fe-4a3d-bee0-134ce7b91d55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 systemd-machined[154022]: New machine qemu-24-instance-00000039.
Jan 21 18:55:31 np0005591285 systemd[1]: Started Virtual Machine qemu-24-instance-00000039.
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.865 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[361aff53-2293-4b2a-a519-70885e8792a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.902 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[94afbbe5-bf24-4fc0-afca-05416cca73cc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.956 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0df693-d770-4eb8-ac17-7c2af89f70b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:31 np0005591285 NetworkManager[55017]: <info>  [1769039731.9677] manager: (tapf32b0ae0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Jan 21 18:55:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:31.965 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fd995d50-35f4-494b-a395-72cafe37215e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.004 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0df40fb1-ff86-4cdc-9d51-08f0d04d814a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.007 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[41bd3d7c-0f07-4325-b12d-291696ea95af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 NetworkManager[55017]: <info>  [1769039732.0413] device (tapf32b0ae0-60): carrier: link connected
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.048 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[428efeec-1d94-4590-9211-957f36a03766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.074 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4d418552-f289-490f-ba45-d9239504ce70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf32b0ae0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:84:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429069, 'reachable_time': 26350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218743, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.100 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef3dc51-abb4-40a3-bc81-fc9515d38801]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:84cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 429069, 'tstamp': 429069}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218744, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.126 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4d4f482b-a17d-4bab-b601-ce00af1d70d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf32b0ae0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:84:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429069, 'reachable_time': 26350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218746, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.139 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 2b44528a-0ec9-4df9-afce-0d76ed92b221 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.140 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039732.1387067, 2b44528a-0ec9-4df9-afce-0d76ed92b221 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.141 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.144 182759 DEBUG nova.compute.manager [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.150 182759 INFO nova.virt.libvirt.driver [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance rebooted successfully.#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.153 182759 DEBUG nova.compute.manager [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.170 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.176 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.180 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bf54cab1-08ca-456c-b6f6-d7a863c8029a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.206 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.207 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039732.141439, 2b44528a-0ec9-4df9-afce-0d76ed92b221 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.207 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] VM Started (Lifecycle Event)#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.231 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.237 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.269 182759 DEBUG oslo_concurrency.lockutils [None req-208de37f-cd46-4fe0-a7ba-b91f2014780b fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.271 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c5eb93ab-c601-4212-89e7-b106ade56db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.273 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32b0ae0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.273 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.274 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf32b0ae0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:32 np0005591285 kernel: tapf32b0ae0-60: entered promiscuous mode
Jan 21 18:55:32 np0005591285 NetworkManager[55017]: <info>  [1769039732.2777] manager: (tapf32b0ae0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.276 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.287 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf32b0ae0-60, col_values=(('external_ids', {'iface-id': '6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:32 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:32Z|00145|binding|INFO|Releasing lport 6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47 from this chassis (sb_readonly=0)
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.291 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.292 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.295 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f19fb6a8-91a7-4807-befa-424d21fe95c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.296 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:55:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:32.299 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'env', 'PROCESS_TAG=haproxy-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.312 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.316 182759 DEBUG nova.network.neutron [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updated VIF entry in instance network info cache for port b07d054c-f7c2-465a-8c0d-fa4f5dac4828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.317 182759 DEBUG nova.network.neutron [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.337 182759 DEBUG oslo_concurrency.lockutils [req-b5032d44-1837-423e-baf1-3f2627850c5f req-e604d6bd-e90f-4a73-ae74-ef5ea4e4f1e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.339 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.340 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:55:32 np0005591285 nova_compute[182755]: 2026-01-21 23:55:32.340 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:32 np0005591285 podman[218778]: 2026-01-21 23:55:32.833919636 +0000 UTC m=+0.077686678 container create 19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:55:32 np0005591285 podman[218778]: 2026-01-21 23:55:32.795842083 +0000 UTC m=+0.039609175 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:55:32 np0005591285 systemd[1]: Started libpod-conmon-19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b.scope.
Jan 21 18:55:32 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:55:32 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47889113cf011959d84ae1cb7243bd34e23456804d923e44c61005904f73ebb9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:55:32 np0005591285 podman[218778]: 2026-01-21 23:55:32.961187397 +0000 UTC m=+0.204954469 container init 19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 18:55:32 np0005591285 podman[218778]: 2026-01-21 23:55:32.968506916 +0000 UTC m=+0.212273958 container start 19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 18:55:32 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [NOTICE]   (218797) : New worker (218799) forked
Jan 21 18:55:33 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [NOTICE]   (218797) : Loading success.
Jan 21 18:55:33 np0005591285 nova_compute[182755]: 2026-01-21 23:55:33.904 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:33 np0005591285 nova_compute[182755]: 2026-01-21 23:55:33.947 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:33 np0005591285 nova_compute[182755]: 2026-01-21 23:55:33.948 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:55:33 np0005591285 nova_compute[182755]: 2026-01-21 23:55:33.949 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:55:33 np0005591285 nova_compute[182755]: 2026-01-21 23:55:33.949 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:55:34 np0005591285 nova_compute[182755]: 2026-01-21 23:55:34.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:34 np0005591285 nova_compute[182755]: 2026-01-21 23:55:34.990 182759 DEBUG nova.compute.manager [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-changed-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:34 np0005591285 nova_compute[182755]: 2026-01-21 23:55:34.991 182759 DEBUG nova.compute.manager [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Refreshing instance network info cache due to event network-changed-b07d054c-f7c2-465a-8c0d-fa4f5dac4828. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:55:34 np0005591285 nova_compute[182755]: 2026-01-21 23:55:34.991 182759 DEBUG oslo_concurrency.lockutils [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:55:34 np0005591285 nova_compute[182755]: 2026-01-21 23:55:34.992 182759 DEBUG oslo_concurrency.lockutils [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:55:34 np0005591285 nova_compute[182755]: 2026-01-21 23:55:34.992 182759 DEBUG nova.network.neutron [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Refreshing network info cache for port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:55:35 np0005591285 podman[218808]: 2026-01-21 23:55:35.301381666 +0000 UTC m=+0.155445398 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.759 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.760 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.761 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.761 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.761 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.778 182759 INFO nova.compute.manager [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Terminating instance#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.789 182759 DEBUG nova.compute.manager [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:55:35 np0005591285 kernel: tapb07d054c-f7 (unregistering): left promiscuous mode
Jan 21 18:55:35 np0005591285 NetworkManager[55017]: <info>  [1769039735.8166] device (tapb07d054c-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:55:35 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:35Z|00146|binding|INFO|Releasing lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 from this chassis (sb_readonly=0)
Jan 21 18:55:35 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:35Z|00147|binding|INFO|Setting lport b07d054c-f7c2-465a-8c0d-fa4f5dac4828 down in Southbound
Jan 21 18:55:35 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:35Z|00148|binding|INFO|Removing iface tapb07d054c-f7 ovn-installed in OVS
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.833 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.836 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:35.845 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:3c:66 10.100.0.5'], port_security=['fa:16:3e:87:3c:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2b44528a-0ec9-4df9-afce-0d76ed92b221', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'neutron:revision_number': '7', 'neutron:security_group_ids': '07a1a0ce-5790-4d2e-8869-adf91647e1de 153f1801-2b3e-433f-a54c-53aa7d6ed086 746fe9a7-60c8-4d8b-9d10-bd8b258787a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c83a09c2-c943-4d92-aedc-1a1adb93cc19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b07d054c-f7c2-465a-8c0d-fa4f5dac4828) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:55:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:35.848 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b07d054c-f7c2-465a-8c0d-fa4f5dac4828 in datapath f32b0ae0-64b5-4b08-b029-da33b7e8f96a unbound from our chassis#033[00m
Jan 21 18:55:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:35.851 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:55:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:35.853 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1dfba0-c5de-4c74-b44a-3f5a37809266]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:35.854 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a namespace which is not needed anymore#033[00m
Jan 21 18:55:35 np0005591285 nova_compute[182755]: 2026-01-21 23:55:35.868 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:35 np0005591285 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 21 18:55:35 np0005591285 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000039.scope: Consumed 4.110s CPU time.
Jan 21 18:55:35 np0005591285 systemd-machined[154022]: Machine qemu-24-instance-00000039 terminated.
Jan 21 18:55:36 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [NOTICE]   (218797) : haproxy version is 2.8.14-c23fe91
Jan 21 18:55:36 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [NOTICE]   (218797) : path to executable is /usr/sbin/haproxy
Jan 21 18:55:36 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [WARNING]  (218797) : Exiting Master process...
Jan 21 18:55:36 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [ALERT]    (218797) : Current worker (218799) exited with code 143 (Terminated)
Jan 21 18:55:36 np0005591285 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[218793]: [WARNING]  (218797) : All workers exited. Exiting... (0)
Jan 21 18:55:36 np0005591285 systemd[1]: libpod-19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b.scope: Deactivated successfully.
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.024 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.029 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 podman[218860]: 2026-01-21 23:55:36.030680584 +0000 UTC m=+0.062307090 container died 19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:55:36 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b-userdata-shm.mount: Deactivated successfully.
Jan 21 18:55:36 np0005591285 systemd[1]: var-lib-containers-storage-overlay-47889113cf011959d84ae1cb7243bd34e23456804d923e44c61005904f73ebb9-merged.mount: Deactivated successfully.
Jan 21 18:55:36 np0005591285 podman[218860]: 2026-01-21 23:55:36.084914835 +0000 UTC m=+0.116541341 container cleanup 19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.099 182759 INFO nova.virt.libvirt.driver [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Instance destroyed successfully.#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.100 182759 DEBUG nova.objects.instance [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'resources' on Instance uuid 2b44528a-0ec9-4df9-afce-0d76ed92b221 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:36 np0005591285 systemd[1]: libpod-conmon-19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b.scope: Deactivated successfully.
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.124 182759 DEBUG nova.virt.libvirt.vif [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-89867646',display_name='tempest-SecurityGroupsTestJSON-server-89867646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-89867646',id=57,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-vin64boz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJSON-744520065-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:32Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=2b44528a-0ec9-4df9-afce-0d76ed92b221,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.126 182759 DEBUG nova.network.os_vif_util [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.127 182759 DEBUG nova.network.os_vif_util [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.127 182759 DEBUG os_vif [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.129 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.130 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb07d054c-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.132 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.134 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.141 182759 INFO os_vif [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:87:3c:66,bridge_name='br-int',has_traffic_filtering=True,id=b07d054c-f7c2-465a-8c0d-fa4f5dac4828,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb07d054c-f7')#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.142 182759 INFO nova.virt.libvirt.driver [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Deleting instance files /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221_del#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.144 182759 INFO nova.virt.libvirt.driver [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Deletion of /var/lib/nova/instances/2b44528a-0ec9-4df9-afce-0d76ed92b221_del complete#033[00m
Jan 21 18:55:36 np0005591285 podman[218906]: 2026-01-21 23:55:36.192606346 +0000 UTC m=+0.061020606 container remove 19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.201 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0d47df60-d8a3-46fa-934c-c22c2d6df5a9]: (4, ('Wed Jan 21 11:55:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a (19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b)\n19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b\nWed Jan 21 11:55:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a (19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b)\n19b229169124fd5c54f9e1311d46b00c25aab98961a580255afe3d7ca3536e8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.204 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0603965b-27d0-4909-8066-671d5a4ec31f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.205 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32b0ae0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 kernel: tapf32b0ae0-60: left promiscuous mode
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.210 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.215 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6efe8347-dba4-4338-825b-9a2faacdb37f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.232 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.245 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ba6627-848d-4978-a768-75cdaa52edde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.247 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[76aae351-f242-47c2-b844-07378300a87f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.251 182759 INFO nova.compute.manager [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.252 182759 DEBUG oslo.service.loopingcall [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.253 182759 DEBUG nova.compute.manager [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.254 182759 DEBUG nova.network.neutron [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.276 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7786d104-dc60-4df6-b1c0-1dc5c8ae9698]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429060, 'reachable_time': 36520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218922, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 systemd[1]: run-netns-ovnmeta\x2df32b0ae0\x2d64b5\x2d4b08\x2db029\x2dda33b7e8f96a.mount: Deactivated successfully.
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.281 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:55:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:36.281 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb1b3c7-4ec3-45f5-9d8f-79da7c2358b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:36 np0005591285 podman[218920]: 2026-01-21 23:55:36.378081396 +0000 UTC m=+0.098886672 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:55:36 np0005591285 podman[218923]: 2026-01-21 23:55:36.383855063 +0000 UTC m=+0.102999255 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.576 182759 DEBUG nova.network.neutron [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updated VIF entry in instance network info cache for port b07d054c-f7c2-465a-8c0d-fa4f5dac4828. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.577 182759 DEBUG nova.network.neutron [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [{"id": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "address": "fa:16:3e:87:3c:66", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb07d054c-f7", "ovs_interfaceid": "b07d054c-f7c2-465a-8c0d-fa4f5dac4828", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.605 182759 DEBUG oslo_concurrency.lockutils [req-9092fcd0-9588-4f63-bfb5-5e8988909385 req-c53579f2-bacc-49d3-955a-2137d4fd0d16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2b44528a-0ec9-4df9-afce-0d76ed92b221" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.691 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.692 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.714 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.878 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.880 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.890 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:55:36 np0005591285 nova_compute[182755]: 2026-01-21 23:55:36.891 182759 INFO nova.compute.claims [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.112 182759 DEBUG nova.compute.provider_tree [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.128 182759 DEBUG nova.network.neutron [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.131 182759 DEBUG nova.scheduler.client.report [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.156 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-unplugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.157 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.158 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.158 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.158 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-unplugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.159 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-unplugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.159 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.159 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.159 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.160 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.160 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.160 182759 WARNING nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received unexpected event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.161 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.161 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.161 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.161 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.162 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.162 182759 WARNING nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received unexpected event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.162 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.163 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.163 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.163 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.163 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.164 182759 WARNING nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received unexpected event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.164 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-unplugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.164 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.165 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.165 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.165 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-unplugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.165 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-unplugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.166 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.166 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.166 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.167 182759 DEBUG oslo_concurrency.lockutils [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.167 182759 DEBUG nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] No waiting events found dispatching network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.167 182759 WARNING nova.compute.manager [req-75a5f1ed-2121-40bd-9d29-a0d789be0202 req-79965801-505f-4cdb-9dba-113cc6210b57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received unexpected event network-vif-plugged-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 for instance with vm_state active and task_state deleting.
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.317 182759 INFO nova.compute.manager [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Took 1.06 seconds to deallocate network for instance.
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.322 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.323 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.396 182759 DEBUG nova.compute.manager [req-38db2ec7-b7a9-4e1e-975a-336c1f3539b4 req-1a651cbf-0fcd-434b-98a2-119484c67802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Received event network-vif-deleted-b07d054c-f7c2-465a-8c0d-fa4f5dac4828 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.430 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.431 182759 DEBUG nova.network.neutron [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.451 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.451 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.454 182759 INFO nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.487 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.555 182759 DEBUG nova.compute.provider_tree [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.589 182759 DEBUG nova.scheduler.client.report [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.623 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.626 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.627 182759 INFO nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Creating image(s)
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.628 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "/var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.629 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.630 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.631 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.632 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.641 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.649 182759 DEBUG nova.policy [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.674 182759 INFO nova.scheduler.client.report [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Deleted allocations for instance 2b44528a-0ec9-4df9-afce-0d76ed92b221
Jan 21 18:55:37 np0005591285 nova_compute[182755]: 2026-01-21 23:55:37.779 182759 DEBUG oslo_concurrency.lockutils [None req-81a93673-ed46-4e26-8b42-813fd1cad033 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "2b44528a-0ec9-4df9-afce-0d76ed92b221" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:38 np0005591285 nova_compute[182755]: 2026-01-21 23:55:38.699 182759 DEBUG nova.network.neutron [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Successfully created port: c9a59fac-68ff-4aa5-abcb-98567b80fb6f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.282 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.386 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.388 182759 DEBUG nova.virt.images [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] 3e1dda74-3c6a-4d29-8792-32134d1c36c5 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.390 182759 DEBUG nova.privsep.utils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.390 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.578 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted" returned: 0 in 0.188s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.583 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.674 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.676 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.703 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.794 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.796 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.797 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.822 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.904 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.906 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.965 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk 1073741824" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.967 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:39 np0005591285 nova_compute[182755]: 2026-01-21 23:55:39.968 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.033 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.036 182759 DEBUG nova.virt.disk.api [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Checking if we can resize image /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.037 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.110 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.112 182759 DEBUG nova.virt.disk.api [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Cannot resize image /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.113 182759 DEBUG nova.objects.instance [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'migration_context' on Instance uuid 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.134 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.135 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Ensure instance console log exists: /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.136 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.137 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:55:40 np0005591285 nova_compute[182755]: 2026-01-21 23:55:40.138 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:55:41 np0005591285 nova_compute[182755]: 2026-01-21 23:55:41.134 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.564 182759 DEBUG nova.network.neutron [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Successfully updated port: c9a59fac-68ff-4aa5-abcb-98567b80fb6f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.606 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "refresh_cache-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.607 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquired lock "refresh_cache-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.607 182759 DEBUG nova.network.neutron [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.688 182759 DEBUG nova.compute.manager [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-changed-c9a59fac-68ff-4aa5-abcb-98567b80fb6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.688 182759 DEBUG nova.compute.manager [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Refreshing instance network info cache due to event network-changed-c9a59fac-68ff-4aa5-abcb-98567b80fb6f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.689 182759 DEBUG oslo_concurrency.lockutils [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:55:43 np0005591285 nova_compute[182755]: 2026-01-21 23:55:43.793 182759 DEBUG nova.network.neutron [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:55:44 np0005591285 nova_compute[182755]: 2026-01-21 23:55:44.951 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.138 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.239 182759 DEBUG nova.network.neutron [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Updating instance_info_cache with network_info: [{"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.268 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Releasing lock "refresh_cache-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.269 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Instance network_info: |[{"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.269 182759 DEBUG oslo_concurrency.lockutils [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.269 182759 DEBUG nova.network.neutron [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Refreshing network info cache for port c9a59fac-68ff-4aa5-abcb-98567b80fb6f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.272 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Start _get_guest_xml network_info=[{"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.278 182759 WARNING nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.285 182759 DEBUG nova.virt.libvirt.host [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.286 182759 DEBUG nova.virt.libvirt.host [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.297 182759 DEBUG nova.virt.libvirt.host [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.298 182759 DEBUG nova.virt.libvirt.host [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.300 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.301 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.302 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.302 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.304 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.304 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.305 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.305 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.305 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.306 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.306 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.306 182759 DEBUG nova.virt.hardware [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.312 182759 DEBUG nova.virt.libvirt.vif [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1934699601',display_name='tempest-ListServerFiltersTestJSON-instance-1934699601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1934699601',id=61,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-ei615cu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempe
st-ListServerFiltersTestJSON-1547380946-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:37Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=3c28cc1f-7479-4ee7-805d-ae13cd2b6dff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.312 182759 DEBUG nova.network.os_vif_util [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.313 182759 DEBUG nova.network.os_vif_util [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.314 182759 DEBUG nova.objects.instance [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.330 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <uuid>3c28cc1f-7479-4ee7-805d-ae13cd2b6dff</uuid>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <name>instance-0000003d</name>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1934699601</nova:name>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:55:46</nova:creationTime>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:user uuid="7e79b904cb8a49f990b05eb0ed72fdf4">tempest-ListServerFiltersTestJSON-1547380946-project-member</nova:user>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:project uuid="70b1c9f8be0042aa8de9841a26729700">tempest-ListServerFiltersTestJSON-1547380946</nova:project>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        <nova:port uuid="c9a59fac-68ff-4aa5-abcb-98567b80fb6f">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <entry name="serial">3c28cc1f-7479-4ee7-805d-ae13cd2b6dff</entry>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <entry name="uuid">3c28cc1f-7479-4ee7-805d-ae13cd2b6dff</entry>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.config"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:58:a7:27"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <target dev="tapc9a59fac-68"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/console.log" append="off"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:55:46 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:55:46 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:55:46 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:55:46 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.331 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Preparing to wait for external event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.332 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.332 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.332 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.333 182759 DEBUG nova.virt.libvirt.vif [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1934699601',display_name='tempest-ListServerFiltersTestJSON-instance-1934699601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1934699601',id=61,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-ei615cu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_n
ame='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:37Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=3c28cc1f-7479-4ee7-805d-ae13cd2b6dff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.333 182759 DEBUG nova.network.os_vif_util [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.334 182759 DEBUG nova.network.os_vif_util [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.335 182759 DEBUG os_vif [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.335 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.336 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.336 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.340 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.341 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9a59fac-68, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.341 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9a59fac-68, col_values=(('external_ids', {'iface-id': 'c9a59fac-68ff-4aa5-abcb-98567b80fb6f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:a7:27', 'vm-uuid': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.343 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:46 np0005591285 NetworkManager[55017]: <info>  [1769039746.3446] manager: (tapc9a59fac-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.345 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.349 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.350 182759 INFO os_vif [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68')#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.468 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.469 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.471 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No VIF found with MAC fa:16:3e:58:a7:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:55:46 np0005591285 nova_compute[182755]: 2026-01-21 23:55:46.473 182759 INFO nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Using config drive#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.133 182759 INFO nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Creating config drive at /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.config#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.140 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphs4gratf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.273 182759 DEBUG oslo_concurrency.processutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphs4gratf" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:47 np0005591285 kernel: tapc9a59fac-68: entered promiscuous mode
Jan 21 18:55:47 np0005591285 NetworkManager[55017]: <info>  [1769039747.3405] manager: (tapc9a59fac-68): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Jan 21 18:55:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:47Z|00149|binding|INFO|Claiming lport c9a59fac-68ff-4aa5-abcb-98567b80fb6f for this chassis.
Jan 21 18:55:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:47Z|00150|binding|INFO|c9a59fac-68ff-4aa5-abcb-98567b80fb6f: Claiming fa:16:3e:58:a7:27 10.100.0.8
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.341 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.353 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:a7:27 10.100.0.8'], port_security=['fa:16:3e:58:a7:27 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c9a59fac-68ff-4aa5-abcb-98567b80fb6f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.354 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c9a59fac-68ff-4aa5-abcb-98567b80fb6f in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 bound to our chassis#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.356 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a78bfb22-a192-4dbe-a117-9f8a59130e27#033[00m
Jan 21 18:55:47 np0005591285 systemd-udevd[219015]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.370 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[220909cf-ba1a-4e75-9422-e68af1678f87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.373 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa78bfb22-a1 in ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.376 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa78bfb22-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.376 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ab9b1d-9d4c-4103-b3a5-908da778ab70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.377 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8d531e-843f-4d06-a893-4b8cfbcc5f84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 NetworkManager[55017]: <info>  [1769039747.3851] device (tapc9a59fac-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:55:47 np0005591285 NetworkManager[55017]: <info>  [1769039747.3859] device (tapc9a59fac-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.396 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6483a5c1-8155-40a2-bccd-8787b8087f9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.401 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.408 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:47Z|00151|binding|INFO|Setting lport c9a59fac-68ff-4aa5-abcb-98567b80fb6f ovn-installed in OVS
Jan 21 18:55:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:47Z|00152|binding|INFO|Setting lport c9a59fac-68ff-4aa5-abcb-98567b80fb6f up in Southbound
Jan 21 18:55:47 np0005591285 systemd-machined[154022]: New machine qemu-25-instance-0000003d.
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.412 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.417 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[33b4dbc3-5be9-4333-add2-d3f33c37eb53]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 systemd[1]: Started Virtual Machine qemu-25-instance-0000003d.
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.452 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9b16de-8bed-4050-a849-40ced8a4c876]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 NetworkManager[55017]: <info>  [1769039747.4585] manager: (tapa78bfb22-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.457 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[36a7f538-f8aa-4237-a17f-673728585f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.488 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3a9fb1f6-d9b0-4cee-a637-303494b0b79b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.492 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[174abfab-f268-4f3a-9026-49e1e0e8fd00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 NetworkManager[55017]: <info>  [1769039747.5162] device (tapa78bfb22-a0): carrier: link connected
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.523 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0f5afe-d240-481a-b3ca-05ca7305e325]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.544 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb00a36-66fc-4759-a42a-f1931eceab22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430617, 'reachable_time': 22882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219050, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.563 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9169e9d1-c66e-474c-856a-4f16cd51a834]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:4194'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430617, 'tstamp': 430617}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219051, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.586 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[883feac1-2aa3-482f-870a-05c7bc111212]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430617, 'reachable_time': 22882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219052, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.622 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a783da0-1d65-49d3-b8ff-71a3fb96ba47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.694 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d00707f1-6b46-4197-85c1-762b6c43a5c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.696 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.697 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.697 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa78bfb22-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.700 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 NetworkManager[55017]: <info>  [1769039747.7009] manager: (tapa78bfb22-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 21 18:55:47 np0005591285 kernel: tapa78bfb22-a0: entered promiscuous mode
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.704 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.705 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa78bfb22-a0, col_values=(('external_ids', {'iface-id': 'bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.707 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 ovn_controller[94908]: 2026-01-21T23:55:47Z|00153|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.734 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.735 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.736 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e863ad1c-989a-43eb-a20e-9e150e5209d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.737 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:55:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:55:47.738 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'env', 'PROCESS_TAG=haproxy-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a78bfb22-a192-4dbe-a117-9f8a59130e27.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.926 182759 DEBUG nova.network.neutron [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Updated VIF entry in instance network info cache for port c9a59fac-68ff-4aa5-abcb-98567b80fb6f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.928 182759 DEBUG nova.network.neutron [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Updating instance_info_cache with network_info: [{"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:55:47 np0005591285 nova_compute[182755]: 2026-01-21 23:55:47.949 182759 DEBUG oslo_concurrency.lockutils [req-c41f4dba-c537-41ff-9859-7871e98106e9 req-f894f215-4055-4da4-bfd2-92b4372dbc89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.158 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039748.1567216, 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.160 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] VM Started (Lifecycle Event)#033[00m
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.189 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.194 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039748.1580286, 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.195 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:55:48 np0005591285 podman[219089]: 2026-01-21 23:55:48.134530852 +0000 UTC m=+0.049399022 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.239 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.283 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:55:48 np0005591285 podman[219089]: 2026-01-21 23:55:48.30410518 +0000 UTC m=+0.218973290 container create 64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 21 18:55:48 np0005591285 nova_compute[182755]: 2026-01-21 23:55:48.313 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:55:48 np0005591285 systemd[1]: Started libpod-conmon-64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190.scope.
Jan 21 18:55:48 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:55:48 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/396d86ced6955a0fe0991607ff3e8d3f69abd2e055809b8d0968a6b8751030c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:55:48 np0005591285 podman[219089]: 2026-01-21 23:55:48.42466959 +0000 UTC m=+0.339537790 container init 64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:55:48 np0005591285 podman[219089]: 2026-01-21 23:55:48.434766674 +0000 UTC m=+0.349634814 container start 64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 18:55:48 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [NOTICE]   (219110) : New worker (219112) forked
Jan 21 18:55:48 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [NOTICE]   (219110) : Loading success.
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.769 182759 DEBUG nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.770 182759 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.771 182759 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.771 182759 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.772 182759 DEBUG nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Processing event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.772 182759 DEBUG nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.772 182759 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.773 182759 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.773 182759 DEBUG oslo_concurrency.lockutils [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.774 182759 DEBUG nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] No waiting events found dispatching network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.774 182759 WARNING nova.compute.manager [req-a8d801e8-db65-4ed9-b27e-6d9bb69dfa22 req-36094cd3-bbb5-4c41-af4e-51d9cbdbc2bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received unexpected event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f for instance with vm_state building and task_state spawning.#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.775 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.782 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039749.7818124, 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.783 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.789 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.795 182759 INFO nova.virt.libvirt.driver [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Instance spawned successfully.#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.796 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.818 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.826 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.832 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.832 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.833 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.833 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.834 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.834 182759 DEBUG nova.virt.libvirt.driver [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.862 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.922 182759 INFO nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Took 12.30 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.923 182759 DEBUG nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:49 np0005591285 nova_compute[182755]: 2026-01-21 23:55:49.954 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:50 np0005591285 nova_compute[182755]: 2026-01-21 23:55:50.024 182759 INFO nova.compute.manager [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Took 13.21 seconds to build instance.#033[00m
Jan 21 18:55:50 np0005591285 nova_compute[182755]: 2026-01-21 23:55:50.053 182759 DEBUG oslo_concurrency.lockutils [None req-c515a25c-1fe2-41b0-9aa3-e2ff88dd1e82 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:51 np0005591285 nova_compute[182755]: 2026-01-21 23:55:51.096 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039736.0947523, 2b44528a-0ec9-4df9-afce-0d76ed92b221 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:55:51 np0005591285 nova_compute[182755]: 2026-01-21 23:55:51.097 182759 INFO nova.compute.manager [-] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:55:51 np0005591285 nova_compute[182755]: 2026-01-21 23:55:51.126 182759 DEBUG nova.compute.manager [None req-b1ee8049-d8e0-4c94-8063-f4e0d7277ce1 - - - - - -] [instance: 2b44528a-0ec9-4df9-afce-0d76ed92b221] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:55:51 np0005591285 nova_compute[182755]: 2026-01-21 23:55:51.345 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:53 np0005591285 podman[219121]: 2026-01-21 23:55:53.247938371 +0000 UTC m=+0.091362409 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Jan 21 18:55:53 np0005591285 podman[219122]: 2026-01-21 23:55:53.286747934 +0000 UTC m=+0.129237546 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.001 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.035 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "fc483404-7890-49b3-a98c-14073862383f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.037 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.065 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.190 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.192 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.208 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.209 182759 INFO nova.compute.claims [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.409 182759 DEBUG nova.compute.provider_tree [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.434 182759 DEBUG nova.scheduler.client.report [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.471 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.473 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.544 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.545 182759 DEBUG nova.network.neutron [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.574 182759 INFO nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.608 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.788 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.792 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.793 182759 INFO nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Creating image(s)#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.795 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "/var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.796 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.798 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "/var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.799 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "6779aa1454b0f9e323fac2693f45a73902da912b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.801 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "6779aa1454b0f9e323fac2693f45a73902da912b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:55 np0005591285 nova_compute[182755]: 2026-01-21 23:55:55.962 182759 DEBUG nova.policy [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6eb1bcf645844eaca088761a04e59542', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63e5713bcd4c429796b251487b6136bc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:55:56 np0005591285 nova_compute[182755]: 2026-01-21 23:55:56.349 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.150 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.248 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.part --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.249 182759 DEBUG nova.virt.images [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] 7a3304a5-0254-4967-ad18-be3da95aaf72 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.250 182759 DEBUG nova.privsep.utils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.251 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.part /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.325 182759 DEBUG nova.network.neutron [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Successfully created port: c6e2fc90-4c49-49de-91ff-3394321164b9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.395 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.part /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.converted" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.404 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.489 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b.converted --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.492 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "6779aa1454b0f9e323fac2693f45a73902da912b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.512 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.586 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.589 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "6779aa1454b0f9e323fac2693f45a73902da912b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.590 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "6779aa1454b0f9e323fac2693f45a73902da912b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.606 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.679 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.682 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b,backing_fmt=raw /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.739 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b,backing_fmt=raw /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.741 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "6779aa1454b0f9e323fac2693f45a73902da912b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.741 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.801 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.803 182759 DEBUG nova.objects.instance [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'migration_context' on Instance uuid fc483404-7890-49b3-a98c-14073862383f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.824 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.825 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Ensure instance console log exists: /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.826 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.826 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:55:57 np0005591285 nova_compute[182755]: 2026-01-21 23:55:57.827 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.607 182759 DEBUG nova.network.neutron [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Successfully updated port: c6e2fc90-4c49-49de-91ff-3394321164b9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.626 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "refresh_cache-fc483404-7890-49b3-a98c-14073862383f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.626 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquired lock "refresh_cache-fc483404-7890-49b3-a98c-14073862383f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.627 182759 DEBUG nova.network.neutron [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.733 182759 DEBUG nova.compute.manager [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Received event network-changed-c6e2fc90-4c49-49de-91ff-3394321164b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.734 182759 DEBUG nova.compute.manager [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Refreshing instance network info cache due to event network-changed-c6e2fc90-4c49-49de-91ff-3394321164b9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:55:58 np0005591285 nova_compute[182755]: 2026-01-21 23:55:58.735 182759 DEBUG oslo_concurrency.lockutils [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-fc483404-7890-49b3-a98c-14073862383f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:55:59 np0005591285 nova_compute[182755]: 2026-01-21 23:55:59.297 182759 DEBUG nova.network.neutron [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:56:00 np0005591285 nova_compute[182755]: 2026-01-21 23:56:00.003 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:01Z|00154|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.092 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 podman[219188]: 2026-01-21 23:56:01.238140173 +0000 UTC m=+0.098675827 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.374 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:01.375 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:01.377 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.489 182759 DEBUG nova.network.neutron [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Updating instance_info_cache with network_info: [{"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.511 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Releasing lock "refresh_cache-fc483404-7890-49b3-a98c-14073862383f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.511 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Instance network_info: |[{"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.512 182759 DEBUG oslo_concurrency.lockutils [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-fc483404-7890-49b3-a98c-14073862383f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.512 182759 DEBUG nova.network.neutron [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Refreshing network info cache for port c6e2fc90-4c49-49de-91ff-3394321164b9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.516 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Start _get_guest_xml network_info=[{"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2026-01-21T23:55:47Z,direct_url=<?>,disk_format='qcow2',id=7a3304a5-0254-4967-ad18-be3da95aaf72,min_disk=1,min_ram=0,name='tempest-test-snap-633392554',owner='63e5713bcd4c429796b251487b6136bc',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2026-01-21T23:55:50Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '7a3304a5-0254-4967-ad18-be3da95aaf72'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.523 182759 WARNING nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.527 182759 DEBUG nova.virt.libvirt.host [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.528 182759 DEBUG nova.virt.libvirt.host [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.536 182759 DEBUG nova.virt.libvirt.host [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.537 182759 DEBUG nova.virt.libvirt.host [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.539 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.539 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2026-01-21T23:55:47Z,direct_url=<?>,disk_format='qcow2',id=7a3304a5-0254-4967-ad18-be3da95aaf72,min_disk=1,min_ram=0,name='tempest-test-snap-633392554',owner='63e5713bcd4c429796b251487b6136bc',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2026-01-21T23:55:50Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.540 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.540 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.540 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.541 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.541 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.541 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.542 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.542 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.542 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.543 182759 DEBUG nova.virt.hardware [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.547 182759 DEBUG nova.virt.libvirt.vif [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1100102574',display_name='tempest-ImagesTestJSON-server-1100102574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1100102574',id=63,image_ref='7a3304a5-0254-4967-ad18-be3da95aaf72',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-1jf74nj6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='2a5e9a44-f095-4122-8db9-4918b6ba22b7',image_min_disk='1',image_min_ram='0',image_owner_id='63e5713bcd4c429796b251487b6136bc',image_owner_project_name='tempest-ImagesTestJSON-126431515',image_owner_user_name='tempest-ImagesTestJSON-126431515-project-member',image_user_id='6eb1bcf645844eaca088761a04e59542',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:55Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=fc483404-7890-49b3-a98c-14073862383f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.548 182759 DEBUG nova.network.os_vif_util [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.549 182759 DEBUG nova.network.os_vif_util [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.550 182759 DEBUG nova.objects.instance [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'pci_devices' on Instance uuid fc483404-7890-49b3-a98c-14073862383f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.575 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <uuid>fc483404-7890-49b3-a98c-14073862383f</uuid>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <name>instance-0000003f</name>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:name>tempest-ImagesTestJSON-server-1100102574</nova:name>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:56:01</nova:creationTime>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:user uuid="6eb1bcf645844eaca088761a04e59542">tempest-ImagesTestJSON-126431515-project-member</nova:user>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:project uuid="63e5713bcd4c429796b251487b6136bc">tempest-ImagesTestJSON-126431515</nova:project>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="7a3304a5-0254-4967-ad18-be3da95aaf72"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        <nova:port uuid="c6e2fc90-4c49-49de-91ff-3394321164b9">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <entry name="serial">fc483404-7890-49b3-a98c-14073862383f</entry>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <entry name="uuid">fc483404-7890-49b3-a98c-14073862383f</entry>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.config"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:32:00:69"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <target dev="tapc6e2fc90-4c"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/console.log" append="off"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:56:01 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:56:01 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:56:01 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:56:01 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.577 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Preparing to wait for external event network-vif-plugged-c6e2fc90-4c49-49de-91ff-3394321164b9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.577 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "fc483404-7890-49b3-a98c-14073862383f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.577 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.578 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.578 182759 DEBUG nova.virt.libvirt.vif [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1100102574',display_name='tempest-ImagesTestJSON-server-1100102574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1100102574',id=63,image_ref='7a3304a5-0254-4967-ad18-be3da95aaf72',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-1jf74nj6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='2a5e9a44-f095-4122-8db9-4918b6ba22b7',image_min_disk='1',image_min_ram='0',image_owner_id='63e5713bcd4c429796b251487b6136bc',image_owner_project_name='tempest-ImagesTestJSON-126431515',image_owner_user_name='tempest-ImagesTestJSON-126431515-project-member',image_user_id='6eb1bcf645844eaca088761a04e59542',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:55Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=fc483404-7890-49b3-a98c-14073862383f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.579 182759 DEBUG nova.network.os_vif_util [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.580 182759 DEBUG nova.network.os_vif_util [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.580 182759 DEBUG os_vif [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.581 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.582 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.586 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.586 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6e2fc90-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.586 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6e2fc90-4c, col_values=(('external_ids', {'iface-id': 'c6e2fc90-4c49-49de-91ff-3394321164b9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:00:69', 'vm-uuid': 'fc483404-7890-49b3-a98c-14073862383f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:01 np0005591285 NetworkManager[55017]: <info>  [1769039761.5895] manager: (tapc6e2fc90-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.588 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.592 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.595 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.596 182759 INFO os_vif [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c')#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.698 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.698 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.699 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] No VIF found with MAC fa:16:3e:32:00:69, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:56:01 np0005591285 nova_compute[182755]: 2026-01-21 23:56:01.700 182759 INFO nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Using config drive#033[00m
Jan 21 18:56:02 np0005591285 nova_compute[182755]: 2026-01-21 23:56:02.368 182759 INFO nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Creating config drive at /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.config#033[00m
Jan 21 18:56:02 np0005591285 nova_compute[182755]: 2026-01-21 23:56:02.376 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl6etl10o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:02 np0005591285 nova_compute[182755]: 2026-01-21 23:56:02.505 182759 DEBUG oslo_concurrency.processutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl6etl10o" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:02 np0005591285 kernel: tapc6e2fc90-4c: entered promiscuous mode
Jan 21 18:56:02 np0005591285 NetworkManager[55017]: <info>  [1769039762.5911] manager: (tapc6e2fc90-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Jan 21 18:56:02 np0005591285 nova_compute[182755]: 2026-01-21 23:56:02.593 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:02Z|00155|binding|INFO|Claiming lport c6e2fc90-4c49-49de-91ff-3394321164b9 for this chassis.
Jan 21 18:56:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:02Z|00156|binding|INFO|c6e2fc90-4c49-49de-91ff-3394321164b9: Claiming fa:16:3e:32:00:69 10.100.0.4
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.616 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:00:69 10.100.0.4'], port_security=['fa:16:3e:32:00:69 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fc483404-7890-49b3-a98c-14073862383f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c6e2fc90-4c49-49de-91ff-3394321164b9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.618 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c6e2fc90-4c49-49de-91ff-3394321164b9 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c bound to our chassis#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.619 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c#033[00m
Jan 21 18:56:02 np0005591285 systemd-udevd[219243]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.634 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9960f9af-3e54-4382-b8e3-e34ca89dcce2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.639 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74e2da48-41 in ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.642 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74e2da48-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.642 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7c1cf3-6c06-470c-b57d-78b6a91b6b0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.643 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a7311e87-be55-4295-a93b-c0481ffecee0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 systemd-machined[154022]: New machine qemu-26-instance-0000003f.
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.661 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[4af33792-a230-4cde-a961-8f36b8acf6ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 NetworkManager[55017]: <info>  [1769039762.6664] device (tapc6e2fc90-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:56:02 np0005591285 NetworkManager[55017]: <info>  [1769039762.6674] device (tapc6e2fc90-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:56:02 np0005591285 systemd[1]: Started Virtual Machine qemu-26-instance-0000003f.
Jan 21 18:56:02 np0005591285 nova_compute[182755]: 2026-01-21 23:56:02.673 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:02Z|00157|binding|INFO|Setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 ovn-installed in OVS
Jan 21 18:56:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:02Z|00158|binding|INFO|Setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 up in Southbound
Jan 21 18:56:02 np0005591285 nova_compute[182755]: 2026-01-21 23:56:02.679 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.695 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1886e3-7fb3-476c-8a86-dac0acf176f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.726 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d5407796-33db-4909-96b0-7e6a06b7fe2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 NetworkManager[55017]: <info>  [1769039762.7323] manager: (tap74e2da48-40): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Jan 21 18:56:02 np0005591285 systemd-udevd[219249]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.733 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f5726003-2422-43fe-aa0f-5c68609d14cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.787 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e1539e8c-e71e-4156-885c-5d5bb938e9e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.791 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f47897-ae4f-4902-a4fc-4a4b8e047ca8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 NetworkManager[55017]: <info>  [1769039762.8247] device (tap74e2da48-40): carrier: link connected
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.834 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[330ba416-112d-434d-abc9-0dab8734360b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.855 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e7fae431-bb78-496a-9ffe-4e34f4cff5e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432148, 'reachable_time': 38738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219277, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.872 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[19dfe123-0ba7-4fdf-beda-0a7c5b8333d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:7549'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432148, 'tstamp': 432148}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219278, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.888 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c766a65c-d6eb-49b9-9698-bd54c64db424]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74e2da48-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:75:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432148, 'reachable_time': 38738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219279, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.926 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6f93df78-ab0d-4888-8ecf-cc88fed736c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.961 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.962 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:02.963 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.000 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[07918e41-1673-44fd-9ede-92a624ac6502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.003 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.003 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.004 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74e2da48-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.006 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:03 np0005591285 kernel: tap74e2da48-40: entered promiscuous mode
Jan 21 18:56:03 np0005591285 NetworkManager[55017]: <info>  [1769039763.0080] manager: (tap74e2da48-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.010 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.012 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74e2da48-40, col_values=(('external_ids', {'iface-id': '5f8f321e-2942-4700-a50e-4b0628052c1b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.013 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:03 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:03Z|00159|binding|INFO|Releasing lport 5f8f321e-2942-4700-a50e-4b0628052c1b from this chassis (sb_readonly=0)
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.015 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.016 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[93e7f246-9e77-46d4-9f33-a30e3b092245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.016 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.017 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/74e2da48-44c2-4c6d-9597-6c47d6247f9c.pid.haproxy
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 74e2da48-44c2-4c6d-9597-6c47d6247f9c
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:03.018 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'env', 'PROCESS_TAG=haproxy-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74e2da48-44c2-4c6d-9597-6c47d6247f9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.021 182759 DEBUG nova.compute.manager [req-a85f18f2-806c-4221-9ee0-f9f1613bb27c req-8334b067-7021-4efc-a52f-3bf5e691811a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Received event network-vif-plugged-c6e2fc90-4c49-49de-91ff-3394321164b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.023 182759 DEBUG oslo_concurrency.lockutils [req-a85f18f2-806c-4221-9ee0-f9f1613bb27c req-8334b067-7021-4efc-a52f-3bf5e691811a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fc483404-7890-49b3-a98c-14073862383f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.024 182759 DEBUG oslo_concurrency.lockutils [req-a85f18f2-806c-4221-9ee0-f9f1613bb27c req-8334b067-7021-4efc-a52f-3bf5e691811a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.025 182759 DEBUG oslo_concurrency.lockutils [req-a85f18f2-806c-4221-9ee0-f9f1613bb27c req-8334b067-7021-4efc-a52f-3bf5e691811a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.025 182759 DEBUG nova.compute.manager [req-a85f18f2-806c-4221-9ee0-f9f1613bb27c req-8334b067-7021-4efc-a52f-3bf5e691811a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Processing event network-vif-plugged-c6e2fc90-4c49-49de-91ff-3394321164b9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.026 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.043 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039763.0428295, fc483404-7890-49b3-a98c-14073862383f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.044 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] VM Started (Lifecycle Event)#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.048 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.053 182759 DEBUG nova.virt.libvirt.driver [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.059 182759 INFO nova.virt.libvirt.driver [-] [instance: fc483404-7890-49b3-a98c-14073862383f] Instance spawned successfully.#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.060 182759 INFO nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Took 7.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.060 182759 DEBUG nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.069 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.073 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.094 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.095 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039763.0431662, fc483404-7890-49b3-a98c-14073862383f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.095 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.121 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.126 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039763.0513806, fc483404-7890-49b3-a98c-14073862383f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.126 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.143 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.146 182759 INFO nova.compute.manager [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Took 8.02 seconds to build instance.#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.148 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.167 182759 DEBUG oslo_concurrency.lockutils [None req-9a6e6287-4ebd-494b-bab4-ac68708dbe9f 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:03 np0005591285 podman[219318]: 2026-01-21 23:56:03.48166987 +0000 UTC m=+0.066587058 container create afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 18:56:03 np0005591285 systemd[1]: Started libpod-conmon-afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb.scope.
Jan 21 18:56:03 np0005591285 podman[219318]: 2026-01-21 23:56:03.449246081 +0000 UTC m=+0.034163299 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:56:03 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:56:03 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/891731a0cefc1eec3b6e1fc5343011971e404f864ace7335416e427df1e9d866/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:56:03 np0005591285 podman[219318]: 2026-01-21 23:56:03.59856497 +0000 UTC m=+0.183482168 container init afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:56:03 np0005591285 podman[219318]: 2026-01-21 23:56:03.6133081 +0000 UTC m=+0.198225278 container start afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 18:56:03 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [NOTICE]   (219337) : New worker (219339) forked
Jan 21 18:56:03 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [NOTICE]   (219337) : Loading success.
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.840 182759 DEBUG nova.network.neutron [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Updated VIF entry in instance network info cache for port c6e2fc90-4c49-49de-91ff-3394321164b9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.841 182759 DEBUG nova.network.neutron [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Updating instance_info_cache with network_info: [{"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:03 np0005591285 nova_compute[182755]: 2026-01-21 23:56:03.862 182759 DEBUG oslo_concurrency.lockutils [req-7c69b8a7-1ab6-4f90-a99d-d143afd1c02e req-2cd1788d-3f54-4914-abdd-13ea734bd8e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-fc483404-7890-49b3-a98c-14073862383f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:03 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:03Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:a7:27 10.100.0.8
Jan 21 18:56:03 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:03Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:a7:27 10.100.0.8
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.041 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.212 182759 DEBUG nova.compute.manager [req-6da35fe7-a60d-4f67-b081-0899191682d1 req-136d57f3-2626-4654-92ae-6ab9a19ea1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Received event network-vif-plugged-c6e2fc90-4c49-49de-91ff-3394321164b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.213 182759 DEBUG oslo_concurrency.lockutils [req-6da35fe7-a60d-4f67-b081-0899191682d1 req-136d57f3-2626-4654-92ae-6ab9a19ea1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fc483404-7890-49b3-a98c-14073862383f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.213 182759 DEBUG oslo_concurrency.lockutils [req-6da35fe7-a60d-4f67-b081-0899191682d1 req-136d57f3-2626-4654-92ae-6ab9a19ea1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.214 182759 DEBUG oslo_concurrency.lockutils [req-6da35fe7-a60d-4f67-b081-0899191682d1 req-136d57f3-2626-4654-92ae-6ab9a19ea1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.214 182759 DEBUG nova.compute.manager [req-6da35fe7-a60d-4f67-b081-0899191682d1 req-136d57f3-2626-4654-92ae-6ab9a19ea1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] No waiting events found dispatching network-vif-plugged-c6e2fc90-4c49-49de-91ff-3394321164b9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.215 182759 WARNING nova.compute.manager [req-6da35fe7-a60d-4f67-b081-0899191682d1 req-136d57f3-2626-4654-92ae-6ab9a19ea1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Received unexpected event network-vif-plugged-c6e2fc90-4c49-49de-91ff-3394321164b9 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.387 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "fc483404-7890-49b3-a98c-14073862383f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.387 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.388 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "fc483404-7890-49b3-a98c-14073862383f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.388 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.388 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.400 182759 INFO nova.compute.manager [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Terminating instance#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.412 182759 DEBUG nova.compute.manager [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:56:05 np0005591285 kernel: tapc6e2fc90-4c (unregistering): left promiscuous mode
Jan 21 18:56:05 np0005591285 NetworkManager[55017]: <info>  [1769039765.4356] device (tapc6e2fc90-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00160|binding|INFO|Releasing lport c6e2fc90-4c49-49de-91ff-3394321164b9 from this chassis (sb_readonly=0)
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00161|binding|INFO|Setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 down in Southbound
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00162|binding|INFO|Removing iface tapc6e2fc90-4c ovn-installed in OVS
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.453 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.461 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:00:69 10.100.0.4'], port_security=['fa:16:3e:32:00:69 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fc483404-7890-49b3-a98c-14073862383f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c6e2fc90-4c49-49de-91ff-3394321164b9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.463 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c6e2fc90-4c49-49de-91ff-3394321164b9 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.465 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.466 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ca76ef68-dd07-48d0-8366-347709a77b24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.467 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c namespace which is not needed anymore#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.476 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Jan 21 18:56:05 np0005591285 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003f.scope: Consumed 2.772s CPU time.
Jan 21 18:56:05 np0005591285 systemd-machined[154022]: Machine qemu-26-instance-0000003f terminated.
Jan 21 18:56:05 np0005591285 podman[219348]: 2026-01-21 23:56:05.590835352 +0000 UTC m=+0.120542330 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:56:05 np0005591285 kernel: tapc6e2fc90-4c: entered promiscuous mode
Jan 21 18:56:05 np0005591285 NetworkManager[55017]: <info>  [1769039765.6411] manager: (tapc6e2fc90-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00163|binding|INFO|Claiming lport c6e2fc90-4c49-49de-91ff-3394321164b9 for this chassis.
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00164|binding|INFO|c6e2fc90-4c49-49de-91ff-3394321164b9: Claiming fa:16:3e:32:00:69 10.100.0.4
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.643 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 kernel: tapc6e2fc90-4c (unregistering): left promiscuous mode
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.656 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:00:69 10.100.0.4'], port_security=['fa:16:3e:32:00:69 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fc483404-7890-49b3-a98c-14073862383f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c6e2fc90-4c49-49de-91ff-3394321164b9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:05 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [NOTICE]   (219337) : haproxy version is 2.8.14-c23fe91
Jan 21 18:56:05 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [NOTICE]   (219337) : path to executable is /usr/sbin/haproxy
Jan 21 18:56:05 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [WARNING]  (219337) : Exiting Master process...
Jan 21 18:56:05 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [WARNING]  (219337) : Exiting Master process...
Jan 21 18:56:05 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [ALERT]    (219337) : Current worker (219339) exited with code 143 (Terminated)
Jan 21 18:56:05 np0005591285 neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c[219333]: [WARNING]  (219337) : All workers exited. Exiting... (0)
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00165|binding|INFO|Setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 ovn-installed in OVS
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00166|binding|INFO|Setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 up in Southbound
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00167|binding|INFO|Releasing lport c6e2fc90-4c49-49de-91ff-3394321164b9 from this chassis (sb_readonly=1)
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00168|if_status|INFO|Not setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 down as sb is readonly
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00169|binding|INFO|Removing iface tapc6e2fc90-4c ovn-installed in OVS
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.689 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.691 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 systemd[1]: libpod-afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb.scope: Deactivated successfully.
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00170|binding|INFO|Releasing lport c6e2fc90-4c49-49de-91ff-3394321164b9 from this chassis (sb_readonly=0)
Jan 21 18:56:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:05Z|00171|binding|INFO|Setting lport c6e2fc90-4c49-49de-91ff-3394321164b9 down in Southbound
Jan 21 18:56:05 np0005591285 podman[219392]: 2026-01-21 23:56:05.699807658 +0000 UTC m=+0.085673575 container died afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.708 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.708 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:00:69 10.100.0.4'], port_security=['fa:16:3e:32:00:69 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fc483404-7890-49b3-a98c-14073862383f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63e5713bcd4c429796b251487b6136bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9b7e0b6-a204-4fa2-b013-6e4797586550', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d83c4887-6d70-4852-825f-2112d1d70c76, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c6e2fc90-4c49-49de-91ff-3394321164b9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:05 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb-userdata-shm.mount: Deactivated successfully.
Jan 21 18:56:05 np0005591285 systemd[1]: var-lib-containers-storage-overlay-891731a0cefc1eec3b6e1fc5343011971e404f864ace7335416e427df1e9d866-merged.mount: Deactivated successfully.
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.751 182759 INFO nova.virt.libvirt.driver [-] [instance: fc483404-7890-49b3-a98c-14073862383f] Instance destroyed successfully.#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.752 182759 DEBUG nova.objects.instance [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lazy-loading 'resources' on Instance uuid fc483404-7890-49b3-a98c-14073862383f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:05 np0005591285 podman[219392]: 2026-01-21 23:56:05.760939926 +0000 UTC m=+0.146805813 container cleanup afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:56:05 np0005591285 systemd[1]: libpod-conmon-afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb.scope: Deactivated successfully.
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.776 182759 DEBUG nova.virt.libvirt.vif [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1100102574',display_name='tempest-ImagesTestJSON-server-1100102574',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1100102574',id=63,image_ref='7a3304a5-0254-4967-ad18-be3da95aaf72',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63e5713bcd4c429796b251487b6136bc',ramdisk_id='',reservation_id='r-1jf74nj6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='2a5e9a44-f095-4122-8db9-4918b6ba22b7',image_min_disk='1',image_min_ram='0',image_owner_id='63e5713bcd4c429796b251487b6136bc',image_owner_project_name='tempest-ImagesTestJSON-126431515',image_owner_user_name='tempest-ImagesTestJSON-126431515-project-member',image_user_id='6eb1bcf645844eaca088761a04e59542',owner_project_name='tempest-ImagesTestJSON-126431515',owner_user_name='tempest-ImagesTestJSON-126431515-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:03Z,user_data=None,user_id='6eb1bcf645844eaca088761a04e59542',uuid=fc483404-7890-49b3-a98c-14073862383f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.777 182759 DEBUG nova.network.os_vif_util [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converting VIF {"id": "c6e2fc90-4c49-49de-91ff-3394321164b9", "address": "fa:16:3e:32:00:69", "network": {"id": "74e2da48-44c2-4c6d-9597-6c47d6247f9c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-926280031-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "63e5713bcd4c429796b251487b6136bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6e2fc90-4c", "ovs_interfaceid": "c6e2fc90-4c49-49de-91ff-3394321164b9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.778 182759 DEBUG nova.network.os_vif_util [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.778 182759 DEBUG os_vif [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.780 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.781 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6e2fc90-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.782 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.784 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.787 182759 INFO os_vif [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:00:69,bridge_name='br-int',has_traffic_filtering=True,id=c6e2fc90-4c49-49de-91ff-3394321164b9,network=Network(74e2da48-44c2-4c6d-9597-6c47d6247f9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6e2fc90-4c')#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.789 182759 INFO nova.virt.libvirt.driver [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Deleting instance files /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f_del#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.790 182759 INFO nova.virt.libvirt.driver [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Deletion of /var/lib/nova/instances/fc483404-7890-49b3-a98c-14073862383f_del complete#033[00m
Jan 21 18:56:05 np0005591285 podman[219438]: 2026-01-21 23:56:05.845324614 +0000 UTC m=+0.050898671 container remove afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.852 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dbed3b7e-e707-46be-915b-8c3ddce9bb80]: (4, ('Wed Jan 21 11:56:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb)\nafb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb\nWed Jan 21 11:56:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c (afb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb)\nafb8cf29a599072555b882756964a7065adc8328d8254564a7c247b90f2d0ecb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.854 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3cb6a6-225a-4b37-b933-9aed8e4220cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.855 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74e2da48-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.857 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 kernel: tap74e2da48-40: left promiscuous mode
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.869 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.872 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[50f242c2-878c-473f-ba03-aa0cc71a8ebd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.889 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e80eac50-a12f-437f-8b52-201325f21249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.892 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[19a6772a-72d0-421a-b156-18f587c68a07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.894 182759 INFO nova.compute.manager [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.895 182759 DEBUG oslo.service.loopingcall [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.896 182759 DEBUG nova.compute.manager [-] [instance: fc483404-7890-49b3-a98c-14073862383f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:56:05 np0005591285 nova_compute[182755]: 2026-01-21 23:56:05.896 182759 DEBUG nova.network.neutron [-] [instance: fc483404-7890-49b3-a98c-14073862383f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.916 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0d04c9aa-33e2-4f5e-89fc-61faf17b5161]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432138, 'reachable_time': 19139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219453, 'error': None, 'target': 'ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.923 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74e2da48-44c2-4c6d-9597-6c47d6247f9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:56:05 np0005591285 systemd[1]: run-netns-ovnmeta\x2d74e2da48\x2d44c2\x2d4c6d\x2d9597\x2d6c47d6247f9c.mount: Deactivated successfully.
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.923 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[eaade938-0771-4457-bf05-893af3cc2cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.925 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c6e2fc90-4c49-49de-91ff-3394321164b9 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.927 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.929 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[56c628f1-5509-4b86-a41e-324d76f523ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.930 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c6e2fc90-4c49-49de-91ff-3394321164b9 in datapath 74e2da48-44c2-4c6d-9597-6c47d6247f9c unbound from our chassis#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.932 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74e2da48-44c2-4c6d-9597-6c47d6247f9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:56:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:05.933 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb4bc8d-d77d-478d-918c-be40c92cf71e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:06 np0005591285 nova_compute[182755]: 2026-01-21 23:56:06.790 182759 DEBUG nova.network.neutron [-] [instance: fc483404-7890-49b3-a98c-14073862383f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:06 np0005591285 nova_compute[182755]: 2026-01-21 23:56:06.821 182759 INFO nova.compute.manager [-] [instance: fc483404-7890-49b3-a98c-14073862383f] Took 0.92 seconds to deallocate network for instance.#033[00m
Jan 21 18:56:06 np0005591285 podman[219454]: 2026-01-21 23:56:06.896334618 +0000 UTC m=+0.077690808 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:56:06 np0005591285 podman[219455]: 2026-01-21 23:56:06.900141141 +0000 UTC m=+0.081489051 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 18:56:06 np0005591285 nova_compute[182755]: 2026-01-21 23:56:06.930 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:06 np0005591285 nova_compute[182755]: 2026-01-21 23:56:06.931 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:07 np0005591285 nova_compute[182755]: 2026-01-21 23:56:07.033 182759 DEBUG nova.compute.provider_tree [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:56:07 np0005591285 nova_compute[182755]: 2026-01-21 23:56:07.053 182759 DEBUG nova.scheduler.client.report [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:56:07 np0005591285 nova_compute[182755]: 2026-01-21 23:56:07.082 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:07 np0005591285 nova_compute[182755]: 2026-01-21 23:56:07.121 182759 INFO nova.scheduler.client.report [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Deleted allocations for instance fc483404-7890-49b3-a98c-14073862383f#033[00m
Jan 21 18:56:07 np0005591285 nova_compute[182755]: 2026-01-21 23:56:07.225 182759 DEBUG oslo_concurrency.lockutils [None req-cf4eb2d1-deb8-4d94-9c5a-44bf9bb86ff3 6eb1bcf645844eaca088761a04e59542 63e5713bcd4c429796b251487b6136bc - - default default] Lock "fc483404-7890-49b3-a98c-14073862383f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:07 np0005591285 nova_compute[182755]: 2026-01-21 23:56:07.335 182759 DEBUG nova.compute.manager [req-d0aad6ee-e38b-4689-a789-6cf069dd6ba0 req-74d6aa0f-472e-4ca2-8022-fbb6f4d9bc48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fc483404-7890-49b3-a98c-14073862383f] Received event network-vif-deleted-c6e2fc90-4c49-49de-91ff-3394321164b9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:09.381 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:10 np0005591285 nova_compute[182755]: 2026-01-21 23:56:10.047 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:10 np0005591285 nova_compute[182755]: 2026-01-21 23:56:10.799 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:13 np0005591285 nova_compute[182755]: 2026-01-21 23:56:13.898 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:13 np0005591285 nova_compute[182755]: 2026-01-21 23:56:13.899 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:13 np0005591285 nova_compute[182755]: 2026-01-21 23:56:13.965 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.198 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.199 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.212 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.213 182759 INFO nova.compute.claims [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.468 182759 DEBUG nova.compute.provider_tree [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.485 182759 DEBUG nova.scheduler.client.report [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.515 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.516 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.613 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.614 182759 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.649 182759 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.688 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.870 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.872 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.873 182759 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Creating image(s)#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.874 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "/var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.874 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "/var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.876 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "/var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.905 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.940 182759 DEBUG nova.policy [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.987 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.988 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:14 np0005591285 nova_compute[182755]: 2026-01-21 23:56:14.989 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.000 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.050 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.092 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.094 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.136 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.138 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.138 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.223 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.225 182759 DEBUG nova.virt.disk.api [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Checking if we can resize image /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.225 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.292 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.293 182759 DEBUG nova.virt.disk.api [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Cannot resize image /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.294 182759 DEBUG nova.objects.instance [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'migration_context' on Instance uuid 7768ce19-cdaa-43a0-9404-cafa72f2d077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.328 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.330 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Ensure instance console log exists: /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.330 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.331 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.331 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:15 np0005591285 nova_compute[182755]: 2026-01-21 23:56:15.826 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:17 np0005591285 nova_compute[182755]: 2026-01-21 23:56:17.367 182759 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Successfully created port: cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.489 182759 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Successfully updated port: cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.522 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "refresh_cache-7768ce19-cdaa-43a0-9404-cafa72f2d077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.523 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquired lock "refresh_cache-7768ce19-cdaa-43a0-9404-cafa72f2d077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.523 182759 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.672 182759 DEBUG nova.compute.manager [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-changed-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.672 182759 DEBUG nova.compute.manager [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Refreshing instance network info cache due to event network-changed-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.673 182759 DEBUG oslo_concurrency.lockutils [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7768ce19-cdaa-43a0-9404-cafa72f2d077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:19 np0005591285 nova_compute[182755]: 2026-01-21 23:56:19.806 182759 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:56:20 np0005591285 nova_compute[182755]: 2026-01-21 23:56:20.054 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:20 np0005591285 nova_compute[182755]: 2026-01-21 23:56:20.746 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039765.7443316, fc483404-7890-49b3-a98c-14073862383f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:20 np0005591285 nova_compute[182755]: 2026-01-21 23:56:20.747 182759 INFO nova.compute.manager [-] [instance: fc483404-7890-49b3-a98c-14073862383f] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:56:20 np0005591285 nova_compute[182755]: 2026-01-21 23:56:20.780 182759 DEBUG nova.compute.manager [None req-df173500-5c61-40bf-ba61-fbcb0b4c8d1e - - - - - -] [instance: fc483404-7890-49b3-a98c-14073862383f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:20 np0005591285 nova_compute[182755]: 2026-01-21 23:56:20.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.017 182759 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Updating instance_info_cache with network_info: [{"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.093 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Releasing lock "refresh_cache-7768ce19-cdaa-43a0-9404-cafa72f2d077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.094 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Instance network_info: |[{"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.095 182759 DEBUG oslo_concurrency.lockutils [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7768ce19-cdaa-43a0-9404-cafa72f2d077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.096 182759 DEBUG nova.network.neutron [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Refreshing network info cache for port cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.101 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Start _get_guest_xml network_info=[{"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.111 182759 WARNING nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.120 182759 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.122 182759 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.129 182759 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.130 182759 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.133 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.134 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.135 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.136 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.136 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.137 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.138 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.138 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.139 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.140 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.140 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.141 182759 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.153 182759 DEBUG nova.virt.libvirt.vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-1',id=65,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name=
'tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:14Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=7768ce19-cdaa-43a0-9404-cafa72f2d077,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.155 182759 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.156 182759 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.158 182759 DEBUG nova.objects.instance [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7768ce19-cdaa-43a0-9404-cafa72f2d077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.184 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <uuid>7768ce19-cdaa-43a0-9404-cafa72f2d077</uuid>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <name>instance-00000041</name>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1677728672-1</nova:name>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:56:21</nova:creationTime>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:user uuid="9a4a4a5f3c9f4c5091261592272bcb81">tempest-ListServersNegativeTestJSON-1787990789-project-member</nova:user>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:project uuid="414437860afc460b9e86d674975e9d1f">tempest-ListServersNegativeTestJSON-1787990789</nova:project>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        <nova:port uuid="cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <entry name="serial">7768ce19-cdaa-43a0-9404-cafa72f2d077</entry>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <entry name="uuid">7768ce19-cdaa-43a0-9404-cafa72f2d077</entry>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.config"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:fa:c6:29"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <target dev="tapcbd11a5b-9e"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/console.log" append="off"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:56:21 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:56:21 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:56:21 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:56:21 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.186 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Preparing to wait for external event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.187 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.187 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.188 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.189 182759 DEBUG nova.virt.libvirt.vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-1',id=65,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_
user_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:14Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=7768ce19-cdaa-43a0-9404-cafa72f2d077,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.190 182759 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.191 182759 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.191 182759 DEBUG os_vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.193 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.193 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.194 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.198 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.199 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcbd11a5b-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.200 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcbd11a5b-9e, col_values=(('external_ids', {'iface-id': 'cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:c6:29', 'vm-uuid': '7768ce19-cdaa-43a0-9404-cafa72f2d077'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.202 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:21 np0005591285 NetworkManager[55017]: <info>  [1769039781.2043] manager: (tapcbd11a5b-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.206 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.214 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.217 182759 INFO os_vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e')#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.295 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.296 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.296 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No VIF found with MAC fa:16:3e:fa:c6:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.296 182759 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Using config drive#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.876 182759 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Creating config drive at /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.config#033[00m
Jan 21 18:56:21 np0005591285 nova_compute[182755]: 2026-01-21 23:56:21.884 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpernma7ej execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.019 182759 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpernma7ej" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:22 np0005591285 kernel: tapcbd11a5b-9e: entered promiscuous mode
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.096 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 NetworkManager[55017]: <info>  [1769039782.0978] manager: (tapcbd11a5b-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 21 18:56:22 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:22Z|00172|binding|INFO|Claiming lport cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc for this chassis.
Jan 21 18:56:22 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:22Z|00173|binding|INFO|cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc: Claiming fa:16:3e:fa:c6:29 10.100.0.8
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.104 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.111 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:c6:29 10.100.0.8'], port_security=['fa:16:3e:fa:c6:29 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.112 104259 INFO neutron.agent.ovn.metadata.agent [-] Port cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc in datapath 835f4434-3fa6-458b-b79c-b27830f531cf bound to our chassis#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.114 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 835f4434-3fa6-458b-b79c-b27830f531cf#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.133 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c486796c-ea43-42f1-8b1d-3ff5b72570ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.135 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap835f4434-31 in ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:56:22 np0005591285 systemd-udevd[219529]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.139 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap835f4434-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.139 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7336cc04-470e-4bb9-8798-49c5f80cd53a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.141 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b987a905-50ba-4665-b35b-c1eb0ee1d9be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.153 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 NetworkManager[55017]: <info>  [1769039782.1588] device (tapcbd11a5b-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:56:22 np0005591285 NetworkManager[55017]: <info>  [1769039782.1596] device (tapcbd11a5b-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:56:22 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:22Z|00174|binding|INFO|Setting lport cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc ovn-installed in OVS
Jan 21 18:56:22 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:22Z|00175|binding|INFO|Setting lport cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc up in Southbound
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.162 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e85e62-92b1-46c6-b608-0570a220aa00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 systemd-machined[154022]: New machine qemu-27-instance-00000041.
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.168 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 systemd[1]: Started Virtual Machine qemu-27-instance-00000041.
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.193 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d76e345d-e694-465f-8f14-4897a8f3f2f4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.249 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e8367a69-3325-48ec-9f54-77392053b191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 NetworkManager[55017]: <info>  [1769039782.2615] manager: (tap835f4434-30): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Jan 21 18:56:22 np0005591285 systemd-udevd[219533]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.260 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d967a1-d09c-41b6-b7eb-7d6c778f96aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.309 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[42359e11-f914-4ae7-a48f-84c38f265026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.314 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[54a58bdf-4ebf-48ba-8052-3d0bda9b4930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 NetworkManager[55017]: <info>  [1769039782.3522] device (tap835f4434-30): carrier: link connected
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.366 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[06994810-6786-45ae-9e8f-af88e0ab9bd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.398 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[558c7c62-ff1f-401b-8230-c89db6646c1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap835f4434-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:51:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434101, 'reachable_time': 33257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219562, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.427 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f10df9fe-53a7-47dc-8157-82efe31fd348]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:5107'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434101, 'tstamp': 434101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219563, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.453 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[43c15b61-8569-4fdd-b84d-e1ab700bcfc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap835f4434-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:51:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434101, 'reachable_time': 33257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219564, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.511 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a67cf729-805a-45b1-b0e6-de976d99e665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.585 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039782.5847113, 7768ce19-cdaa-43a0-9404-cafa72f2d077 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.586 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] VM Started (Lifecycle Event)#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.593 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[799e4596-595d-4aec-83b7-7b6e9aff7ec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.595 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835f4434-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.595 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.596 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap835f4434-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 kernel: tap835f4434-30: entered promiscuous mode
Jan 21 18:56:22 np0005591285 NetworkManager[55017]: <info>  [1769039782.6000] manager: (tap835f4434-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.604 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap835f4434-30, col_values=(('external_ids', {'iface-id': '8bc16eeb-6666-4300-9ce8-0a810442a173'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.605 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:22Z|00176|binding|INFO|Releasing lport 8bc16eeb-6666-4300-9ce8-0a810442a173 from this chassis (sb_readonly=0)
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.631 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.632 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.632 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.635 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[66e9a28e-9cd8-44f8-ba7c-7464fab618e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.636 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:56:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:22.637 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'env', 'PROCESS_TAG=haproxy-835f4434-3fa6-458b-b79c-b27830f531cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/835f4434-3fa6-458b-b79c-b27830f531cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.638 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039782.585121, 7768ce19-cdaa-43a0-9404-cafa72f2d077 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.638 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.727 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.739 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:56:22 np0005591285 nova_compute[182755]: 2026-01-21 23:56:22.781 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.049 182759 DEBUG nova.network.neutron [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Updated VIF entry in instance network info cache for port cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.050 182759 DEBUG nova.network.neutron [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Updating instance_info_cache with network_info: [{"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.082 182759 DEBUG oslo_concurrency.lockutils [req-f62f9a62-0010-4072-944d-df7f081bf732 req-6f0ad488-78bb-486b-b858-81a6c2dcadca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7768ce19-cdaa-43a0-9404-cafa72f2d077" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:23 np0005591285 podman[219603]: 2026-01-21 23:56:23.105128786 +0000 UTC m=+0.083400853 container create 6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:56:23 np0005591285 systemd[1]: Started libpod-conmon-6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6.scope.
Jan 21 18:56:23 np0005591285 podman[219603]: 2026-01-21 23:56:23.067830575 +0000 UTC m=+0.046102662 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.157 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '70b1c9f8be0042aa8de9841a26729700', 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'hostId': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.161 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000041', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '414437860afc460b9e86d674975e9d1f', 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'hostId': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.162 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.190 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/cpu volume: 11910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.214 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0422cdc677962067809d062a173647c9cf392341d153e924181ea1956f806b91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f72fb791-231c-43eb-9f98-b5eebd808d91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11910000000, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'timestamp': '2026-01-21T23:56:23.162438', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca3629f4-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.909801376, 'message_signature': '4cb60af38c2fe6a5a09273ad0baf4b5e148c7ed855320cec7ccb763762f07459'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 
'7768ce19-cdaa-43a0-9404-cafa72f2d077', 'timestamp': '2026-01-21T23:56:23.162438', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca39b0b0-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.933513918, 'message_signature': '4eced58bc5a4e22a41d0c86390909353c218aa9db4a7e5246c396b11bd285f5c'}]}, 'timestamp': '2026-01-21 23:56:23.215026', '_unique_id': 'ee2966dd6b3b44dfa2f56ac8d9ceb37d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.218 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.221 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff / tapc9a59fac-68 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.222 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.224 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7768ce19-cdaa-43a0-9404-cafa72f2d077 / tapcbd11a5b-9e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.224 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0683e383-7026-4409-a397-684309bd00cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.219684', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca3ad6ac-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': 'bf9345c0d2077714df5e69971b28579c92c2d7ebffb6b1c7f0928ef16c64be6a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.219684', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca3b29fe-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': 'c61db5534331059578b93f81fd649da9961a52d586115282608b553289e7f698'}]}, 'timestamp': '2026-01-21 23:56:23.224491', '_unique_id': 'db451da3f445402687527f262e998f46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 18:56:23 np0005591285 podman[219603]: 2026-01-21 23:56:23.226371124 +0000 UTC m=+0.204643221 container init 6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:56:23 np0005591285 podman[219603]: 2026-01-21 23:56:23.234011132 +0000 UTC m=+0.212283189 container start 6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.257 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.read.latency volume: 235608411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.258 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.read.latency volume: 28439515 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [NOTICE]   (219622) : New worker (219624) forked
Jan 21 18:56:23 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [NOTICE]   (219622) : Loading success.
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.296 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.297 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebae5c83-a100-4635-8f63-60a7f57b5bc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 235608411, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.225837', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca404e16-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '476412dc301e85aef9ad7ff2a4882202cdf94f63c53e6240341ed5e4c2d0fa60'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28439515, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.225837', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca405a82-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '782e949ef9a6cb33b5600e6b8e3a6f090d5237c29b2fd136d2619f64bba9759f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.225837', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca463d3a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '86db44a7faa95f78753f5aae884fcddb4e5e096a422d49a4faee068419a6e28a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.225837', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4649ce-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '510efa3cff245fa910e25c302de377938ea7bada1536edb96e4cecad1f595380'}]}, 'timestamp': '2026-01-21 23:56:23.297442', '_unique_id': 'c39141cf71fa4f4489b509ba44d90084'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.298 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.299 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.299 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/memory.usage volume: 42.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.299 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 7768ce19-cdaa-43a0-9404-cafa72f2d077: ceilometer.compute.pollsters.NoVolumeException
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '075297ae-5e4e-48d2-bff4-08e0322b7189', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6796875, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'timestamp': '2026-01-21T23:56:23.299679', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ca46ae32-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.909801376, 'message_signature': 'a7adca1b6952d57ebb513d0e4ed28769502114a76f14e9e0545a0bfb2d2f4a3a'}]}, 'timestamp': '2026-01-21 23:56:23.300176', '_unique_id': '28d6a9845ec24a4a85de96e9dbfb51b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.301 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.write.bytes volume: 72876032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.301 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.302 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.302 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d35d9a8-5282-4ef2-89f1-7ed94cbd5122', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72876032, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.301666', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca46fa68-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '4dc56a6514163723d7c1bdbc115d41741285598be4a72a7f16f1aab1bf808245'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.301666', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca470346-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '98f9b8a327384885fc379cbeddf22b59dd0109d6891c1194dc37510734bee2f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.301666', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca470ab2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '535bd781da2f62cae9c17f5ac22b35a1a894808268c6c886dffd7206ab37e8b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.301666', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4711d8-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': 'bebe2163e448bbe6ff131d0ed9777979738a3adc2f8dee13d7d96fc50935b1d7'}]}, 'timestamp': '2026-01-21 23:56:23.302489', '_unique_id': '0415814871914aaca18558306189702b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.303 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99082047-6783-4547-9a5b-1b7ee528e79b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.303722', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca474b12-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '66f2374a2441ac5b51ea00a8b0922e887f9a2878d66319eb5b885acba6d7d10c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.303722', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca4754fe-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': '3e740f6a48cc272deee9414f70153fdc5e8ffd5e65137ccf221bea8197ab1301'}]}, 'timestamp': '2026-01-21 23:56:23.304218', '_unique_id': '50a66470a8c34160a012e0d92e4f78d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.304 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.305 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.305 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.read.requests volume: 1055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.305 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.306 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.306 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfb6a0ca-a048-412c-9d00-d7dc31cce971', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1055, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.305626', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4795ea-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': 'c8879a46f4c08d6247518363ac70ab23b23acaad905fb93760540bd77a876671'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 
'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.305626', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca479eaa-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': 'd7c0a82bc944580431be4e555c070b0ee253531539a4619bef879a061eb1b7a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.305626', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca47a616-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '46d0e48fc6661348e601fc30c5b20b9dcb8a510012c5b8eeea6e91c313a1b932'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.305626', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca47ad28-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '5887235f5a82f9c7463a48d0218e148a1e75af13cf44eaf8f99fba00101b9cc0'}]}, 'timestamp': '2026-01-21 23:56:23.306465', '_unique_id': '53280cd7d59a442baa482bb11cf2b5ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.307 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a1c6be-229e-42dc-b267-69009221634b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.307744', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca47e8e2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '0ff3018ee91ac61b7079c29a4e3ef85a1ad25bcf698e58164351adff8e00e3e6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.307744', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca47f166-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': '9f6f72a1dae7be674bf4570b6bf58c49a62a4fab162f9dd3c2f37136919d932a'}]}, 'timestamp': '2026-01-21 23:56:23.308220', '_unique_id': '31bf65001b8248d895ec7f0c5e625302'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.328 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.328 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.338 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.339 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd386598f-e053-4773-a381-ac3ff0df7f7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.309355', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4b08ec-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.028589817, 'message_signature': '21cc49a1d21157795778503d46990f4391bce7aaedbf150776d22f509e8f2c8b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 
'3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.309355', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4b1472-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.028589817, 'message_signature': '18ff7c7d91734d8d97aca777f7e8cbff0eef3eb5ab379a1d6bb6bc866579ebde'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.309355', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4cadfa-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.047967613, 'message_signature': '48cb91634bfd9d2a96bd3cae514a8188796a5ef9af0f7120fa1d450a5498af04'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.309355', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4cb8ae-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.047967613, 'message_signature': 'eafe990790f31732b346fd8a7b1b03b71eab6510b448e21a72edd999c344707d'}]}, 'timestamp': '2026-01-21 23:56:23.339562', '_unique_id': '6842b01782b348edac13dd45e5492a25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.340 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.341 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.341 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.341 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '374328f4-b232-4543-97b0-8cacce45cddc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.341573', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca4d1196-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '06dba775c7a6f87a1445145e7881aa15dc52ef6f8dae5e84fec34f272ba8fc9e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.341573', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca4d1b14-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': '3639578874001769687bff725bf31f4cbaf4b79aa6b86b9ed80261b65a5a9564'}]}, 'timestamp': '2026-01-21 23:56:23.342059', '_unique_id': 'a9289e0f8ff14cdca5ba9d2658857832'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.342 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.343 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.343 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.343 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66fe0d2a-73da-4068-a311-05fd179f1a4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 307, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.343247', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4d52b4-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '8d6c20a9b26ce7fde146542be4ac9702adcc29b5c87452a314cb9a9381b439fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 
'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.343247', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4d6394-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': 'b328f62c9f609fb7e1029a0359757952ab202749706d182ff95e486c69a4e859'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.343247', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4d6c7c-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '4eba16b131a66933d59e70a0de7e97d93b9c816c479437817d15a46a5caf84af'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.343247', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4d73f2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': 'a053555917c3e517d2d1d2e5702d987b69ad0ee636e2917fa47a8e35f2f56ebc'}]}, 'timestamp': '2026-01-21 23:56:23.344331', '_unique_id': 'd4acdc996a884809bccb39924f5b530a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.344 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.345 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.345 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.345 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>]
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.346 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.346 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.346 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad4ddb36-087a-434d-8001-6195910e8631', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.346089', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca4dc172-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '3c8b95bcb873392a191daed9b3e2a724c9537b1446247709f2ba2f446587bf53'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.346089', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca4dcb36-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': '33fa81940655186179ae0e3a318d6daeb397a3245544d271ee02513ec275f990'}]}, 'timestamp': '2026-01-21 23:56:23.346566', '_unique_id': 'a798b5fb7ed6437fb605f3d57c91fa1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.347 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.348 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.348 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.348 182759 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.349 182759 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9288f09e-b274-4322-8633-1adb103d055c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.347673', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4dff20-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.028589817, 'message_signature': '34b99f5101858bb7af120c53cf217730539ad84f04b68b827b19dea0b7a082fb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.347673', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4e0876-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.028589817, 'message_signature': '74c89653f50051c2f8865462d083ed2f2de53bd330344b5694c4fcccc743adfb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.347673', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4e1050-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.047967613, 'message_signature': '60d1abc6b845b232aa58631ffb7a4a2229ae314e110cae508235d5d08f9f972a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.347673', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4e17f8-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.047967613, 'message_signature': '05b2e3afc234179cc191d03bb034798e4463af64b451c3daff9cc73f82a491bd'}]}, 'timestamp': '2026-01-21 23:56:23.348520', '_unique_id': 'db5f9f88186c4d3bb22f9730fb001c3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.349 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.350 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.350 182759 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.350 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.350 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.350 182759 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.350 182759 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Processing event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbc8f7bb-5c6c-4c06-a41a-252db635d75f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.349957', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4e58a8-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.028589817, 'message_signature': '1248ef26511008c54e10b1744ea624d015c802a9ba36a2d8b604860435489710'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 
'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.349957', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4e615e-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.028589817, 'message_signature': '6fd0c072f1efb5ebf38766613bbfd4ded1875bb829d06624bab5b83bc4646eab'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.349957', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4e691a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.047967613, 'message_signature': '074f387f9efa1dfd825f6981f547287569e104e67709d964c691ad764af036c0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.349957', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca4e7086-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4342.047967613, 'message_signature': '50cea23822a8a1426e69b76d6005201c46b1828659e7c3aaea941c00cb3e8e0f'}]}, 'timestamp': '2026-01-21 23:56:23.350786', '_unique_id': '698792bfd3a04fea91f44c02a8af63c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.351 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.352 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>]
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.351 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.352 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.352 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>]
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.352 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.353 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.356 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039783.355381, 7768ce19-cdaa-43a0-9404-cafa72f2d077 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.356 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '587a06fe-75aa-432f-9601-ae4e5b4522f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.352537', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca4ebd0c-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '50fae051470b179204d14f811cef321daba31fe68d19bccf52021befef54d1eb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.352537', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca4ed04e-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': 'c5a1c054b28ebfba4ef4f8abaf8e1130e7edd12a28969ea6ecfbf5572f1c3a7e'}]}, 'timestamp': '2026-01-21 23:56:23.353257', '_unique_id': '05d7378d7acd499d86a16ab3e36de831'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.359 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.359 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.359 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.360 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a744896e-c6e6-4e21-8d12-78a87f958a85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.359219', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca4fcd8c-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '5fb4406cbf28d2022a2535a3ed4622eef6e4d0909512b0f2a04919670f986a53'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.359219', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca4fe704-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': '4f55fea3ba777fa580e086e4ab3027a2f9c01b64b61c13221248aacb3f5992dc'}]}, 'timestamp': '2026-01-21 23:56:23.360573', '_unique_id': '2392d53bc9ef471db7c7d154ebb2958f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.363 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.363 182759 INFO nova.virt.libvirt.driver [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Instance spawned successfully.#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.364 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.read.bytes volume: 29190656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.364 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.365 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.365 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.366 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8efcfe72-ab77-4cf2-9f49-9fbde959ff4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29190656, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.364189', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca508e8e-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '78388c5a148b4a4573846a4ea1f3d8a38c99a88e9d8cdfd74a30bbb913c99fda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': 
None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.364189', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca50b396-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': 'd0f982e5351316f33fdfb419cc4953215782e0ad075031f73210dae2ed3e4e6d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.364189', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca50d286-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '6e1001995d41c9adca9c7eca4c2277328dbde5f2394cb5faa13e731aac8601df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.364189', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca50efd2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '5dbd294a5224030dea15f151619759c4aec8d199a473e1eea70a894074991aa1'}]}, 'timestamp': '2026-01-21 23:56:23.367402', '_unique_id': '30f2231c5f06434483102c0c719e8e07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.370 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.371 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '943b62cc-63a9-40a2-aacc-6c5a2270b650', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.370624', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca5188d4-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': '1e7e67b508fea10b69eb3abcba923cd5370c92b0d4bbae34c6c1c005ed1d791c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.370624', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca519e00-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': '803ca0b20c30c71591938e31423638bf4d8d4079ece9fbd0ac9279f841cd85bc'}]}, 'timestamp': '2026-01-21 23:56:23.371755', '_unique_id': '4e7926cac1f84e14a4e60bc716809315'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.372 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.374 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.375 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dda68f59-5fc1-4f4e-9eed-b63109b4591c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.374948', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca522f5a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': 'de5c5fbc542259d73669a8764e3424875d4f2196f2136e8388a7dc10aaa16672'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.374948', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca52426a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': 'a2b9de6f5fac12a11f5a0ee969ebbff994a480d295e90b07edea8cec4bd2f586'}]}, 'timestamp': '2026-01-21 23:56:23.375994', '_unique_id': '33cd85ada22f465d87ec78b609a56f5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.377 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.378 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.378 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.378 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1934699601>, <NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-1>]
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.379 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.379 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.379 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '558ced79-8b05-4dc4-b61d-a3e8d87af0c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': 'instance-0000003d-3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-tapc9a59fac-68', 'timestamp': '2026-01-21T23:56:23.379301', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'tapc9a59fac-68', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:a7:27', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc9a59fac-68'}, 'message_id': 'ca52d8b0-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.938884654, 'message_signature': 'b1eb95125df00916ed3665e9b2c14b9fb41bc1a4c01f8a89adb6a434afa0d989'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000041-7768ce19-cdaa-43a0-9404-cafa72f2d077-tapcbd11a5b-9e', 'timestamp': '2026-01-21T23:56:23.379301', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'tapcbd11a5b-9e', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fa:c6:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcbd11a5b-9e'}, 'message_id': 'ca52ee9a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.941528186, 'message_signature': 'bbe734f15f90b2970225ed738e1e61c789b18d3912e06a111418ed1e4377c6ba'}]}, 'timestamp': '2026-01-21 23:56:23.380456', '_unique_id': 'f6a32a8feaf5443e9f238072f9806895'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.383 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.383 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.write.latency volume: 1853061707 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.384 12 DEBUG ceilometer.compute.pollsters [-] 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.384 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.385 12 DEBUG ceilometer.compute.pollsters [-] 7768ce19-cdaa-43a0-9404-cafa72f2d077/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef5dcaff-508d-4621-9e7f-36b8943aea5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1853061707, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-vda', 'timestamp': '2026-01-21T23:56:23.383359', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca537b08-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '0717fe4f9848577f1ee1ea85a9720340b71dc653211ac8d90fc7bd2704325543'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_name': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 
'project_name': None, 'resource_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-sda', 'timestamp': '2026-01-21T23:56:23.383359', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1934699601', 'name': 'instance-0000003d', 'instance_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'instance_type': 'm1.nano', 'host': '3da21ce2eff432485c6c9ffe39b17a0dfaf50da221da7a27fe3dbf1f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3e1dda74-3c6a-4d29-8792-32134d1c36c5'}, 'image_ref': '3e1dda74-3c6a-4d29-8792-32134d1c36c5', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca5395f2-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.945027771, 'message_signature': '38e99e19a1b8d2b07515ddb36716d509f425da04f35a58e382f1de814635618c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-vda', 'timestamp': '2026-01-21T23:56:23.383359', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca53a9b6-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': 'bd5a43c953d2554e903c2c74ec7fcc3b958638a45cfcf655b956c6106b332b82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077-sda', 'timestamp': '2026-01-21T23:56:23.383359', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-1', 'name': 'instance-00000041', 'instance_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'instance_type': 'm1.nano', 'host': 'd9631757f800d29c241e4abfd23b0ae48959b8cda62adc0042c28989', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca53c18a-f724-11f0-b13b-fa163e425b77', 'monotonic_time': 4341.977680376, 'message_signature': '75bcf20a2e8ad8289bfd3362f785c8234083629d33d41c99a73c56342de8c2f9'}]}, 'timestamp': '2026-01-21 23:56:23.385835', '_unique_id': 'e0ee65ca6e034ddc9999ba9e9f35ccb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 18:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:56:23.387 12 ERROR oslo_messaging.notify.messaging 
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.441 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.442 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.582 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.582 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.590 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.590 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.590 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.591 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.591 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.592 182759 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.597 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.642 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.706 182759 INFO nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Took 8.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.707 182759 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.752 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.752 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.762 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.762 182759 INFO nova.compute.claims [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.836 182759 INFO nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Took 9.76 seconds to build instance.#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.864 182759 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.971 182759 DEBUG nova.compute.provider_tree [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:56:23 np0005591285 nova_compute[182755]: 2026-01-21 23:56:23.990 182759 DEBUG nova.scheduler.client.report [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.015 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.016 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.086 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.086 182759 DEBUG nova.network.neutron [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.114 182759 INFO nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.153 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:56:24 np0005591285 podman[219634]: 2026-01-21 23:56:24.212105448 +0000 UTC m=+0.065120517 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 18:56:24 np0005591285 podman[219633]: 2026-01-21 23:56:24.239165453 +0000 UTC m=+0.099009256 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.299 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.301 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.301 182759 INFO nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Creating image(s)#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.302 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.302 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.303 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.316 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.341 182759 DEBUG nova.policy [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.394 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.395 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.395 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.407 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.503 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.504 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.545 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.547 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.548 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.620 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.621 182759 DEBUG nova.virt.disk.api [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Checking if we can resize image /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.622 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.681 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.682 182759 DEBUG nova.virt.disk.api [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Cannot resize image /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.683 182759 DEBUG nova.objects.instance [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'migration_context' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.702 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.703 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Ensure instance console log exists: /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.703 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.704 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.704 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.761 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.766 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.769 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.773 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.775 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.842 182759 INFO nova.compute.manager [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Terminating instance#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.876 182759 DEBUG nova.compute.manager [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:56:24 np0005591285 kernel: tapc9a59fac-68 (unregistering): left promiscuous mode
Jan 21 18:56:24 np0005591285 NetworkManager[55017]: <info>  [1769039784.9035] device (tapc9a59fac-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:56:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:24Z|00177|binding|INFO|Releasing lport c9a59fac-68ff-4aa5-abcb-98567b80fb6f from this chassis (sb_readonly=0)
Jan 21 18:56:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:24Z|00178|binding|INFO|Setting lport c9a59fac-68ff-4aa5-abcb-98567b80fb6f down in Southbound
Jan 21 18:56:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:24Z|00179|binding|INFO|Removing iface tapc9a59fac-68 ovn-installed in OVS
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.916 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:24.940 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:a7:27 10.100.0.8'], port_security=['fa:16:3e:58:a7:27 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3c28cc1f-7479-4ee7-805d-ae13cd2b6dff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c9a59fac-68ff-4aa5-abcb-98567b80fb6f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:24.942 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c9a59fac-68ff-4aa5-abcb-98567b80fb6f in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 unbound from our chassis#033[00m
Jan 21 18:56:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:24.943 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a78bfb22-a192-4dbe-a117-9f8a59130e27, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:56:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:24.945 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8f7fb185-a65f-4bde-b721-d4202965d51e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:24.946 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace which is not needed anymore#033[00m
Jan 21 18:56:24 np0005591285 nova_compute[182755]: 2026-01-21 23:56:24.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:24 np0005591285 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Jan 21 18:56:24 np0005591285 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000003d.scope: Consumed 14.706s CPU time.
Jan 21 18:56:24 np0005591285 systemd-machined[154022]: Machine qemu-25-instance-0000003d terminated.
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.113 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:25 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [NOTICE]   (219110) : haproxy version is 2.8.14-c23fe91
Jan 21 18:56:25 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [NOTICE]   (219110) : path to executable is /usr/sbin/haproxy
Jan 21 18:56:25 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [WARNING]  (219110) : Exiting Master process...
Jan 21 18:56:25 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [WARNING]  (219110) : Exiting Master process...
Jan 21 18:56:25 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [ALERT]    (219110) : Current worker (219112) exited with code 143 (Terminated)
Jan 21 18:56:25 np0005591285 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[219106]: [WARNING]  (219110) : All workers exited. Exiting... (0)
Jan 21 18:56:25 np0005591285 systemd[1]: libpod-64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190.scope: Deactivated successfully.
Jan 21 18:56:25 np0005591285 conmon[219106]: conmon 64266c0d3701dfbdf561 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190.scope/container/memory.events
Jan 21 18:56:25 np0005591285 podman[219708]: 2026-01-21 23:56:25.13579039 +0000 UTC m=+0.057022587 container died 64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:56:25 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190-userdata-shm.mount: Deactivated successfully.
Jan 21 18:56:25 np0005591285 systemd[1]: var-lib-containers-storage-overlay-396d86ced6955a0fe0991607ff3e8d3f69abd2e055809b8d0968a6b8751030c6-merged.mount: Deactivated successfully.
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.182 182759 INFO nova.virt.libvirt.driver [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Instance destroyed successfully.#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.183 182759 DEBUG nova.objects.instance [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'resources' on Instance uuid 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:25 np0005591285 podman[219708]: 2026-01-21 23:56:25.189566608 +0000 UTC m=+0.110798795 container cleanup 64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 18:56:25 np0005591285 systemd[1]: libpod-conmon-64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190.scope: Deactivated successfully.
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.204 182759 DEBUG nova.virt.libvirt.vif [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1934699601',display_name='tempest-ListServerFiltersTestJSON-instance-1934699601',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1934699601',id=61,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-ei615cu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:49Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=3c28cc1f-7479-4ee7-805d-ae13cd2b6dff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.206 182759 DEBUG nova.network.os_vif_util [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "address": "fa:16:3e:58:a7:27", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9a59fac-68", "ovs_interfaceid": "c9a59fac-68ff-4aa5-abcb-98567b80fb6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.206 182759 DEBUG nova.network.os_vif_util [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.207 182759 DEBUG os_vif [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.210 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.212 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9a59fac-68, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.218 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.220 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.222 182759 INFO os_vif [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:a7:27,bridge_name='br-int',has_traffic_filtering=True,id=c9a59fac-68ff-4aa5-abcb-98567b80fb6f,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9a59fac-68')#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.222 182759 INFO nova.virt.libvirt.driver [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Deleting instance files /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff_del#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.223 182759 INFO nova.virt.libvirt.driver [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Deletion of /var/lib/nova/instances/3c28cc1f-7479-4ee7-805d-ae13cd2b6dff_del complete#033[00m
Jan 21 18:56:25 np0005591285 podman[219758]: 2026-01-21 23:56:25.264586833 +0000 UTC m=+0.044493948 container remove 64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.273 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3782a36b-01e1-4d8a-8118-fd5dfe8947f4]: (4, ('Wed Jan 21 11:56:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190)\n64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190\nWed Jan 21 11:56:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190)\n64266c0d3701dfbdf5614f771b0b55a55e99b434e82e42ceb4788dd7ac9b0190\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.275 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6c98e5ad-8fab-4e3c-be31-9126f7acfc81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.276 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:25 np0005591285 kernel: tapa78bfb22-a0: left promiscuous mode
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.288 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.293 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.298 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ac30849e-3f0c-4dd5-b711-d22d7084ebd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.313 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a727d44c-ed97-49ec-9b72-3916d144ed4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.316 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[055db17b-d981-4998-9b49-c75b0fc41d8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.323 182759 INFO nova.compute.manager [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.323 182759 DEBUG oslo.service.loopingcall [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.323 182759 DEBUG nova.compute.manager [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.324 182759 DEBUG nova.network.neutron [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.338 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4692a7-ba6e-47f9-9416-2d69fe47bb9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430610, 'reachable_time': 38957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219774, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.342 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:56:25 np0005591285 systemd[1]: run-netns-ovnmeta\x2da78bfb22\x2da192\x2d4dbe\x2da117\x2d9f8a59130e27.mount: Deactivated successfully.
Jan 21 18:56:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:25.343 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[8bcf838b-dd41-4e82-bfac-cd04400981c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.481 182759 DEBUG nova.network.neutron [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Successfully created port: 8a89069d-b676-4bb0-ab19-1d71370566f0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.491 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.492 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.492 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.493 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.493 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] No waiting events found dispatching network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.493 182759 WARNING nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received unexpected event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc for instance with vm_state active and task_state None.#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.494 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-vif-unplugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.494 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.494 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.495 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.495 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] No waiting events found dispatching network-vif-unplugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.495 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-vif-unplugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.495 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.496 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.496 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.496 182759 DEBUG oslo_concurrency.lockutils [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.497 182759 DEBUG nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] No waiting events found dispatching network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:25 np0005591285 nova_compute[182755]: 2026-01-21 23:56:25.497 182759 WARNING nova.compute.manager [req-b3ccd975-3d78-4e23-9454-6973edaad347 req-a240d915-4a95-4a44-837b-ee4603cd528d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received unexpected event network-vif-plugged-c9a59fac-68ff-4aa5-abcb-98567b80fb6f for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.254 182759 DEBUG nova.network.neutron [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.280 182759 INFO nova.compute.manager [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Took 0.96 seconds to deallocate network for instance.#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.319 182759 DEBUG nova.network.neutron [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Successfully updated port: 8a89069d-b676-4bb0-ab19-1d71370566f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.357 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.358 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.358 182759 DEBUG nova.network.neutron [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.390 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.391 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.479 182759 DEBUG nova.compute.manager [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-changed-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.480 182759 DEBUG nova.compute.manager [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Refreshing instance network info cache due to event network-changed-8a89069d-b676-4bb0-ab19-1d71370566f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.480 182759 DEBUG oslo_concurrency.lockutils [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.509 182759 DEBUG nova.compute.provider_tree [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.528 182759 DEBUG nova.scheduler.client.report [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.565 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.592 182759 DEBUG nova.network.neutron [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.604 182759 INFO nova.scheduler.client.report [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Deleted allocations for instance 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff#033[00m
Jan 21 18:56:26 np0005591285 nova_compute[182755]: 2026-01-21 23:56:26.720 182759 DEBUG oslo_concurrency.lockutils [None req-42c5bf16-9df2-48f9-a470-43e0daccc1a8 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3c28cc1f-7479-4ee7-805d-ae13cd2b6dff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:27 np0005591285 nova_compute[182755]: 2026-01-21 23:56:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:27 np0005591285 nova_compute[182755]: 2026-01-21 23:56:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:27 np0005591285 nova_compute[182755]: 2026-01-21 23:56:27.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:56:27 np0005591285 nova_compute[182755]: 2026-01-21 23:56:27.638 182759 DEBUG nova.compute.manager [req-51c5d0c4-709a-42cd-af6e-a14bbf812b33 req-ec3ff821-0a7c-4874-97e5-2014071de0af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Received event network-vif-deleted-c9a59fac-68ff-4aa5-abcb-98567b80fb6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.247 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.325 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.368 182759 DEBUG nova.network.neutron [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.391 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.391 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Instance network_info: |[{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.392 182759 DEBUG oslo_concurrency.lockutils [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.392 182759 DEBUG nova.network.neutron [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Refreshing network info cache for port 8a89069d-b676-4bb0-ab19-1d71370566f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.398 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Start _get_guest_xml network_info=[{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.406 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.407 182759 WARNING nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.409 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.454 182759 DEBUG nova.virt.libvirt.host [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.455 182759 DEBUG nova.virt.libvirt.host [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.459 182759 DEBUG nova.virt.libvirt.host [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.460 182759 DEBUG nova.virt.libvirt.host [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.463 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.463 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.464 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.465 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.465 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.465 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.466 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.466 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.466 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.467 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.467 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.468 182759 DEBUG nova.virt.hardware [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.477 182759 DEBUG nova.virt.libvirt.vif [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.477 182759 DEBUG nova.network.os_vif_util [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.478 182759 DEBUG nova.network.os_vif_util [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.479 182759 DEBUG nova.objects.instance [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_devices' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.497 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.500 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <uuid>d30c29ef-0595-4d30-a826-50e21d7d3463</uuid>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <name>instance-00000044</name>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:56:28</nova:creationTime>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <entry name="serial">d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <entry name="uuid">d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:04:81:11"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <target dev="tap8a89069d-b6"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log" append="off"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:56:28 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:56:28 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:56:28 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:56:28 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.501 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Preparing to wait for external event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.501 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.501 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.502 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.502 182759 DEBUG nova.virt.libvirt.vif [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.503 182759 DEBUG nova.network.os_vif_util [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.503 182759 DEBUG nova.network.os_vif_util [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.504 182759 DEBUG os_vif [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.504 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.505 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.505 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.509 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a89069d-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.509 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a89069d-b6, col_values=(('external_ids', {'iface-id': '8a89069d-b676-4bb0-ab19-1d71370566f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:81:11', 'vm-uuid': 'd30c29ef-0595-4d30-a826-50e21d7d3463'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.511 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 NetworkManager[55017]: <info>  [1769039788.5137] manager: (tap8a89069d-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.514 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.524 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.525 182759 INFO os_vif [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6')#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.594 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.595 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.595 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:04:81:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.596 182759 INFO nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Using config drive#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.691 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.692 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.692 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.693 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.693 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.705 182759 INFO nova.compute.manager [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Terminating instance#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.720 182759 DEBUG nova.compute.manager [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:56:28 np0005591285 kernel: tapcbd11a5b-9e (unregistering): left promiscuous mode
Jan 21 18:56:28 np0005591285 NetworkManager[55017]: <info>  [1769039788.7558] device (tapcbd11a5b-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.758 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.759 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5564MB free_disk=73.30340576171875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.759 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.760 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.762 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:28Z|00180|binding|INFO|Releasing lport cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc from this chassis (sb_readonly=0)
Jan 21 18:56:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:28Z|00181|binding|INFO|Setting lport cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc down in Southbound
Jan 21 18:56:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:28Z|00182|binding|INFO|Removing iface tapcbd11a5b-9e ovn-installed in OVS
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.764 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:28.770 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fa:c6:29 10.100.0.8'], port_security=['fa:16:3e:fa:c6:29 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7768ce19-cdaa-43a0-9404-cafa72f2d077', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:28.772 104259 INFO neutron.agent.ovn.metadata.agent [-] Port cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc in datapath 835f4434-3fa6-458b-b79c-b27830f531cf unbound from our chassis#033[00m
Jan 21 18:56:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:28.773 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 835f4434-3fa6-458b-b79c-b27830f531cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:56:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:28.775 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2a735c57-630e-4145-90a8-3619f2c61130]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:28.775 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf namespace which is not needed anymore#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.785 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:28 np0005591285 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000041.scope: Deactivated successfully.
Jan 21 18:56:28 np0005591285 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000041.scope: Consumed 5.871s CPU time.
Jan 21 18:56:28 np0005591285 systemd-machined[154022]: Machine qemu-27-instance-00000041 terminated.
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.864 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 7768ce19-cdaa-43a0-9404-cafa72f2d077 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.865 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance d30c29ef-0595-4d30-a826-50e21d7d3463 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.865 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.865 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:56:28 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [NOTICE]   (219622) : haproxy version is 2.8.14-c23fe91
Jan 21 18:56:28 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [NOTICE]   (219622) : path to executable is /usr/sbin/haproxy
Jan 21 18:56:28 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [WARNING]  (219622) : Exiting Master process...
Jan 21 18:56:28 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [WARNING]  (219622) : Exiting Master process...
Jan 21 18:56:28 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [ALERT]    (219622) : Current worker (219624) exited with code 143 (Terminated)
Jan 21 18:56:28 np0005591285 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[219618]: [WARNING]  (219622) : All workers exited. Exiting... (0)
Jan 21 18:56:28 np0005591285 systemd[1]: libpod-6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6.scope: Deactivated successfully.
Jan 21 18:56:28 np0005591285 conmon[219618]: conmon 6aa4eda3d15466d53861 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6.scope/container/memory.events
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.948 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:56:28 np0005591285 podman[219811]: 2026-01-21 23:56:28.951278399 +0000 UTC m=+0.060350708 container died 6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:56:28 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6-userdata-shm.mount: Deactivated successfully.
Jan 21 18:56:28 np0005591285 systemd[1]: var-lib-containers-storage-overlay-0422cdc677962067809d062a173647c9cf392341d153e924181ea1956f806b91-merged.mount: Deactivated successfully.
Jan 21 18:56:28 np0005591285 podman[219811]: 2026-01-21 23:56:28.995054477 +0000 UTC m=+0.104126796 container cleanup 6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:56:28 np0005591285 nova_compute[182755]: 2026-01-21 23:56:28.997 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.014 182759 INFO nova.virt.libvirt.driver [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Instance destroyed successfully.#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.015 182759 DEBUG nova.objects.instance [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'resources' on Instance uuid 7768ce19-cdaa-43a0-9404-cafa72f2d077 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:29 np0005591285 systemd[1]: libpod-conmon-6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6.scope: Deactivated successfully.
Jan 21 18:56:29 np0005591285 podman[219858]: 2026-01-21 23:56:29.075145088 +0000 UTC m=+0.046418409 container remove 6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.076 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.077 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.079 182759 DEBUG nova.virt.libvirt.vif [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-1',id=65,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:23Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=7768ce19-cdaa-43a0-9404-cafa72f2d077,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.080 182759 DEBUG nova.network.os_vif_util [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "address": "fa:16:3e:fa:c6:29", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcbd11a5b-9e", "ovs_interfaceid": "cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.081 182759 DEBUG nova.network.os_vif_util [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.081 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d4eb0f-df53-4349-91a6-903e26459516]: (4, ('Wed Jan 21 11:56:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf (6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6)\n6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6\nWed Jan 21 11:56:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf (6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6)\n6aa4eda3d15466d5386119ecef1c489403c3a4c617c46fedc147aa1aa17a29e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.082 182759 DEBUG os_vif [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.083 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d77be3cb-d660-40f7-b8ec-d73ecae6bd24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.084 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835f4434-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.085 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.085 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcbd11a5b-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:29 np0005591285 kernel: tap835f4434-30: left promiscuous mode
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.089 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.119 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.123 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5cd273-efa9-43da-8d42-c566cf405ffc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.134 182759 INFO os_vif [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:c6:29,bridge_name='br-int',has_traffic_filtering=True,id=cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcbd11a5b-9e')#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.135 182759 INFO nova.virt.libvirt.driver [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Deleting instance files /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077_del#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.135 182759 INFO nova.virt.libvirt.driver [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Deletion of /var/lib/nova/instances/7768ce19-cdaa-43a0-9404-cafa72f2d077_del complete#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.146 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b8db13cd-9a8e-46b3-8cc6-1c5cb10d7952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.148 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2bc909-fe74-4397-9b44-4dadab592bde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.175 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4890abc-bbda-4559-a9c3-18a3f1d7a13c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434089, 'reachable_time': 15774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219877, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.179 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:56:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:29.180 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[60a44629-4d2d-4f3a-910e-56b9866174d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:29 np0005591285 systemd[1]: run-netns-ovnmeta\x2d835f4434\x2d3fa6\x2d458b\x2db79c\x2db27830f531cf.mount: Deactivated successfully.
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.318 182759 INFO nova.compute.manager [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.321 182759 DEBUG oslo.service.loopingcall [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.323 182759 DEBUG nova.compute.manager [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.324 182759 DEBUG nova.network.neutron [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.765 182759 DEBUG nova.compute.manager [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-vif-unplugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.766 182759 DEBUG oslo_concurrency.lockutils [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.766 182759 DEBUG oslo_concurrency.lockutils [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.766 182759 DEBUG oslo_concurrency.lockutils [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.767 182759 DEBUG nova.compute.manager [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] No waiting events found dispatching network-vif-unplugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.767 182759 DEBUG nova.compute.manager [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-vif-unplugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.767 182759 DEBUG nova.compute.manager [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.768 182759 DEBUG oslo_concurrency.lockutils [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.768 182759 DEBUG oslo_concurrency.lockutils [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.768 182759 DEBUG oslo_concurrency.lockutils [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.769 182759 DEBUG nova.compute.manager [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] No waiting events found dispatching network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:29 np0005591285 nova_compute[182755]: 2026-01-21 23:56:29.769 182759 WARNING nova.compute.manager [req-61117e1e-a3a3-4d76-8362-1f0bd99140d2 req-938f8f27-e256-4c67-88c8-2241106d5b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received unexpected event network-vif-plugged-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.112 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.388 182759 INFO nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Creating config drive at /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.398 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpahjrznk5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.526 182759 DEBUG oslo_concurrency.processutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpahjrznk5" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:56:30 np0005591285 kernel: tap8a89069d-b6: entered promiscuous mode
Jan 21 18:56:30 np0005591285 systemd-udevd[219791]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:56:30 np0005591285 NetworkManager[55017]: <info>  [1769039790.6453] manager: (tap8a89069d-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.646 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:30 np0005591285 NetworkManager[55017]: <info>  [1769039790.6571] device (tap8a89069d-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:56:30 np0005591285 NetworkManager[55017]: <info>  [1769039790.6580] device (tap8a89069d-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:56:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:30Z|00183|binding|INFO|Claiming lport 8a89069d-b676-4bb0-ab19-1d71370566f0 for this chassis.
Jan 21 18:56:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:30Z|00184|binding|INFO|8a89069d-b676-4bb0-ab19-1d71370566f0: Claiming fa:16:3e:04:81:11 10.100.0.14
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.658 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.670 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:81:11 10.100.0.14'], port_security=['fa:16:3e:04:81:11 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd30c29ef-0595-4d30-a826-50e21d7d3463', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3580c2cf-9b7e-4a0b-a165-8de5bca87e40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8a89069d-b676-4bb0-ab19-1d71370566f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.672 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8a89069d-b676-4bb0-ab19-1d71370566f0 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.673 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.689 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f71856c9-f493-4455-a772-fb96046c3671]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.690 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1995baab-01 in ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.694 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1995baab-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.694 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc0076f-0b50-47c5-9ecb-5c08447a0bdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.697 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bee89d07-73e3-40ce-8e31-199244f7df6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.709 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[da6a2ccc-1a1e-4616-b2ee-31b0e0b31df8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 systemd-machined[154022]: New machine qemu-28-instance-00000044.
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.729 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:30 np0005591285 systemd[1]: Started Virtual Machine qemu-28-instance-00000044.
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.729 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e6741d4e-1b16-4c00-91f0-20ddf3de5f36]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:30Z|00185|binding|INFO|Setting lport 8a89069d-b676-4bb0-ab19-1d71370566f0 ovn-installed in OVS
Jan 21 18:56:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:30Z|00186|binding|INFO|Setting lport 8a89069d-b676-4bb0-ab19-1d71370566f0 up in Southbound
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.746 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.777 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d46b32-7322-4a68-829e-f171b54d479e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 NetworkManager[55017]: <info>  [1769039790.7869] manager: (tap1995baab-00): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.784 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5d1d20-afaa-4315-9cb4-6e3e2689f351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.812 182759 DEBUG nova.network.neutron [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updated VIF entry in instance network info cache for port 8a89069d-b676-4bb0-ab19-1d71370566f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.813 182759 DEBUG nova.network.neutron [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.831 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0c97aaab-df15-497f-8ce8-6652eca45526]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.835 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c05f39-d22d-46d4-b3e8-b4fb4418393d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.840 182759 DEBUG nova.network.neutron [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.844 182759 DEBUG oslo_concurrency.lockutils [req-379cd3ec-2a79-48d3-8bdc-933778f0e3f9 req-3220a8b7-2e34-4348-9d84-7e04b5af4522 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.854 182759 INFO nova.compute.manager [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Took 1.53 seconds to deallocate network for instance.#033[00m
Jan 21 18:56:30 np0005591285 NetworkManager[55017]: <info>  [1769039790.8708] device (tap1995baab-00): carrier: link connected
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.875 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bfadcaba-faa1-4e0a-b6be-ed3b1eed16e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.918 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[94f63c77-2edc-4b82-9bef-4827f89b5568]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434952, 'reachable_time': 38323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219928, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.929 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:30 np0005591285 nova_compute[182755]: 2026-01-21 23:56:30.929 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.944 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[230e0566-c539-4710-814d-84c6ff5061a0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:ff2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434952, 'tstamp': 434952}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219929, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:30.972 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc8fa33-8de9-4b0d-acc6-9f1592a3433a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434952, 'reachable_time': 38323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219930, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.008 182759 DEBUG nova.compute.provider_tree [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.013 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a79db286-800f-4aea-b3cf-538ec5c01b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.026 182759 DEBUG nova.scheduler.client.report [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.056 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.088 182759 INFO nova.scheduler.client.report [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Deleted allocations for instance 7768ce19-cdaa-43a0-9404-cafa72f2d077#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.090 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[31a918b5-1987-4429-a402-abbe660ffef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.092 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.094 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.094 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.094 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:31 np0005591285 kernel: tap1995baab-00: entered promiscuous mode
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:31 np0005591285 NetworkManager[55017]: <info>  [1769039791.0989] manager: (tap1995baab-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.100 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:56:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:31Z|00187|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.102 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.131 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.132 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5c18e30d-d1f0-4535-a4d0-caa7db596033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.134 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:56:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:56:31.136 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'env', 'PROCESS_TAG=haproxy-1995baab-0f8d-4658-a4fc-2d21868dc592', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1995baab-0f8d-4658-a4fc-2d21868dc592.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.143 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039791.1428254, d30c29ef-0595-4d30-a826-50e21d7d3463 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.144 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] VM Started (Lifecycle Event)#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.171 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.176 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039791.143176, d30c29ef-0595-4d30-a826-50e21d7d3463 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.176 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.202 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.205 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.231 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.241 182759 DEBUG oslo_concurrency.lockutils [None req-31668c17-8b54-4361-aadd-3620835da512 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "7768ce19-cdaa-43a0-9404-cafa72f2d077" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.345 182759 DEBUG nova.compute.manager [req-e87cc06c-a8c2-4d72-b1bd-53b3ba123942 req-e69407e9-0630-4108-a803-a7cdf40fb23d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Received event network-vif-deleted-cbd11a5b-9ef3-4c5b-a638-afc4464a8dfc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:31Z|00188|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:31 np0005591285 podman[219969]: 2026-01-21 23:56:31.587788452 +0000 UTC m=+0.068212530 container create 3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.598 182759 DEBUG nova.compute.manager [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.599 182759 DEBUG oslo_concurrency.lockutils [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.599 182759 DEBUG oslo_concurrency.lockutils [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.599 182759 DEBUG oslo_concurrency.lockutils [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.600 182759 DEBUG nova.compute.manager [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Processing event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.600 182759 DEBUG nova.compute.manager [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.600 182759 DEBUG oslo_concurrency.lockutils [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.601 182759 DEBUG oslo_concurrency.lockutils [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.601 182759 DEBUG oslo_concurrency.lockutils [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.601 182759 DEBUG nova.compute.manager [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.601 182759 WARNING nova.compute.manager [req-aef09fb2-fc31-4334-b579-d7acaf2c9d5a req-e5bd5424-0805-470b-939c-a7db32bff27e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received unexpected event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.602 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.607 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039791.6067863, d30c29ef-0595-4d30-a826-50e21d7d3463 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.607 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.610 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.614 182759 INFO nova.virt.libvirt.driver [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Instance spawned successfully.#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.614 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:56:31 np0005591285 systemd[1]: Started libpod-conmon-3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6.scope.
Jan 21 18:56:31 np0005591285 podman[219969]: 2026-01-21 23:56:31.555465856 +0000 UTC m=+0.035890024 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.645 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.652 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.653 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.653 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.653 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.654 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.654 182759 DEBUG nova.virt.libvirt.driver [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:56:31 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.659 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:56:31 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2934eaa9cd350eab130f04301aaf070eaa211f248ed4f4a407dbbdca6160b58f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:56:31 np0005591285 podman[219969]: 2026-01-21 23:56:31.692016169 +0000 UTC m=+0.172440277 container init 3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 18:56:31 np0005591285 podman[219969]: 2026-01-21 23:56:31.698044962 +0000 UTC m=+0.178469050 container start 3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 18:56:31 np0005591285 podman[219983]: 2026-01-21 23:56:31.732419245 +0000 UTC m=+0.085156811 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:56:31 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [NOTICE]   (220002) : New worker (220012) forked
Jan 21 18:56:31 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [NOTICE]   (220002) : Loading success.
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.933 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.934 182759 INFO nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Took 7.63 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:56:31 np0005591285 nova_compute[182755]: 2026-01-21 23:56:31.935 182759 DEBUG nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.078 182759 INFO nova.compute.manager [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Took 8.38 seconds to build instance.#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.099 182759 DEBUG oslo_concurrency.lockutils [None req-58a0e89d-0650-449c-a608-79ae149a1c1b 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.479 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.480 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.480 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:56:32 np0005591285 nova_compute[182755]: 2026-01-21 23:56:32.481 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:34 np0005591285 nova_compute[182755]: 2026-01-21 23:56:34.090 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:34 np0005591285 nova_compute[182755]: 2026-01-21 23:56:34.210 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:34 np0005591285 nova_compute[182755]: 2026-01-21 23:56:34.245 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:34 np0005591285 nova_compute[182755]: 2026-01-21 23:56:34.246 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:56:34 np0005591285 nova_compute[182755]: 2026-01-21 23:56:34.247 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:35 np0005591285 nova_compute[182755]: 2026-01-21 23:56:35.115 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:35 np0005591285 nova_compute[182755]: 2026-01-21 23:56:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:56:35 np0005591285 nova_compute[182755]: 2026-01-21 23:56:35.894 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:35 np0005591285 NetworkManager[55017]: <info>  [1769039795.8952] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 21 18:56:35 np0005591285 NetworkManager[55017]: <info>  [1769039795.8970] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Jan 21 18:56:35 np0005591285 nova_compute[182755]: 2026-01-21 23:56:35.989 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:35 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:35Z|00189|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:56:36 np0005591285 nova_compute[182755]: 2026-01-21 23:56:36.000 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:36 np0005591285 podman[220022]: 2026-01-21 23:56:36.24867558 +0000 UTC m=+0.119561284 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:56:36 np0005591285 nova_compute[182755]: 2026-01-21 23:56:36.467 182759 DEBUG nova.compute.manager [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-changed-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:36 np0005591285 nova_compute[182755]: 2026-01-21 23:56:36.467 182759 DEBUG nova.compute.manager [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Refreshing instance network info cache due to event network-changed-8a89069d-b676-4bb0-ab19-1d71370566f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:56:36 np0005591285 nova_compute[182755]: 2026-01-21 23:56:36.467 182759 DEBUG oslo_concurrency.lockutils [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:36 np0005591285 nova_compute[182755]: 2026-01-21 23:56:36.467 182759 DEBUG oslo_concurrency.lockutils [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:36 np0005591285 nova_compute[182755]: 2026-01-21 23:56:36.468 182759 DEBUG nova.network.neutron [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Refreshing network info cache for port 8a89069d-b676-4bb0-ab19-1d71370566f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:56:37 np0005591285 podman[220050]: 2026-01-21 23:56:37.214178555 +0000 UTC m=+0.074046410 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:56:37 np0005591285 podman[220051]: 2026-01-21 23:56:37.236823959 +0000 UTC m=+0.082854108 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:56:38 np0005591285 nova_compute[182755]: 2026-01-21 23:56:38.409 182759 DEBUG nova.network.neutron [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updated VIF entry in instance network info cache for port 8a89069d-b676-4bb0-ab19-1d71370566f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:56:38 np0005591285 nova_compute[182755]: 2026-01-21 23:56:38.411 182759 DEBUG nova.network.neutron [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:56:38 np0005591285 nova_compute[182755]: 2026-01-21 23:56:38.441 182759 DEBUG oslo_concurrency.lockutils [req-11933ea4-3bc5-4ef1-980c-a8af09d54915 req-150cd8ae-222f-4c70-bab8-b9ed826ccaca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:56:39 np0005591285 nova_compute[182755]: 2026-01-21 23:56:39.096 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:39 np0005591285 nova_compute[182755]: 2026-01-21 23:56:39.482 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:40 np0005591285 nova_compute[182755]: 2026-01-21 23:56:40.122 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:40 np0005591285 nova_compute[182755]: 2026-01-21 23:56:40.178 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039785.177128, 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:40 np0005591285 nova_compute[182755]: 2026-01-21 23:56:40.180 182759 INFO nova.compute.manager [-] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:56:40 np0005591285 nova_compute[182755]: 2026-01-21 23:56:40.260 182759 DEBUG nova.compute.manager [None req-3b6e0479-c735-4114-8c12-4073bdffe2f4 - - - - - -] [instance: 3c28cc1f-7479-4ee7-805d-ae13cd2b6dff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:40 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:40Z|00190|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:56:40 np0005591285 nova_compute[182755]: 2026-01-21 23:56:40.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:42 np0005591285 nova_compute[182755]: 2026-01-21 23:56:42.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:42 np0005591285 nova_compute[182755]: 2026-01-21 23:56:42.868 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:44 np0005591285 nova_compute[182755]: 2026-01-21 23:56:44.008 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039789.0062869, 7768ce19-cdaa-43a0-9404-cafa72f2d077 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:56:44 np0005591285 nova_compute[182755]: 2026-01-21 23:56:44.009 182759 INFO nova.compute.manager [-] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:56:44 np0005591285 nova_compute[182755]: 2026-01-21 23:56:44.051 182759 DEBUG nova.compute.manager [None req-151fa644-4759-4dbd-979a-73fc3e56406b - - - - - -] [instance: 7768ce19-cdaa-43a0-9404-cafa72f2d077] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:56:44 np0005591285 nova_compute[182755]: 2026-01-21 23:56:44.100 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:44 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:44Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:81:11 10.100.0.14
Jan 21 18:56:44 np0005591285 ovn_controller[94908]: 2026-01-21T23:56:44Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:81:11 10.100.0.14
Jan 21 18:56:45 np0005591285 nova_compute[182755]: 2026-01-21 23:56:45.177 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:48 np0005591285 nova_compute[182755]: 2026-01-21 23:56:48.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:49 np0005591285 nova_compute[182755]: 2026-01-21 23:56:49.102 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:50 np0005591285 nova_compute[182755]: 2026-01-21 23:56:50.180 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:54 np0005591285 nova_compute[182755]: 2026-01-21 23:56:54.105 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:54 np0005591285 nova_compute[182755]: 2026-01-21 23:56:54.533 182759 DEBUG oslo_concurrency.lockutils [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-d30c29ef-0595-4d30-a826-50e21d7d3463-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:56:54 np0005591285 nova_compute[182755]: 2026-01-21 23:56:54.534 182759 DEBUG oslo_concurrency.lockutils [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-d30c29ef-0595-4d30-a826-50e21d7d3463-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:56:54 np0005591285 nova_compute[182755]: 2026-01-21 23:56:54.536 182759 DEBUG nova.objects.instance [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:55 np0005591285 nova_compute[182755]: 2026-01-21 23:56:55.217 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:55 np0005591285 podman[220104]: 2026-01-21 23:56:55.261244617 +0000 UTC m=+0.097745792 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:56:55 np0005591285 podman[220103]: 2026-01-21 23:56:55.276144701 +0000 UTC m=+0.115179305 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 18:56:55 np0005591285 nova_compute[182755]: 2026-01-21 23:56:55.678 182759 DEBUG nova.objects.instance [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_requests' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:56:55 np0005591285 nova_compute[182755]: 2026-01-21 23:56:55.692 182759 DEBUG nova.network.neutron [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:56:56 np0005591285 nova_compute[182755]: 2026-01-21 23:56:56.037 182759 DEBUG nova.policy [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:56:56 np0005591285 nova_compute[182755]: 2026-01-21 23:56:56.764 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:57 np0005591285 nova_compute[182755]: 2026-01-21 23:56:57.468 182759 DEBUG nova.network.neutron [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Successfully created port: e7e9d1d2-925c-4854-8e1c-90b360ae8694 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.117 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.392 182759 DEBUG nova.network.neutron [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Successfully updated port: e7e9d1d2-925c-4854-8e1c-90b360ae8694 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.409 182759 DEBUG oslo_concurrency.lockutils [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.410 182759 DEBUG oslo_concurrency.lockutils [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.410 182759 DEBUG nova.network.neutron [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.659 182759 WARNING nova.network.neutron [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.890 182759 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-changed-e7e9d1d2-925c-4854-8e1c-90b360ae8694 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.891 182759 DEBUG nova.compute.manager [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Refreshing instance network info cache due to event network-changed-e7e9d1d2-925c-4854-8e1c-90b360ae8694. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:56:59 np0005591285 nova_compute[182755]: 2026-01-21 23:56:59.892 182759 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:57:00 np0005591285 nova_compute[182755]: 2026-01-21 23:57:00.271 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 podman[220141]: 2026-01-21 23:57:02.244913621 +0000 UTC m=+0.098860653 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.257 182759 DEBUG nova.network.neutron [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.279 182759 DEBUG oslo_concurrency.lockutils [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.280 182759 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.280 182759 DEBUG nova.network.neutron [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Refreshing network info cache for port e7e9d1d2-925c-4854-8e1c-90b360ae8694 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.287 182759 DEBUG nova.virt.libvirt.vif [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.287 182759 DEBUG nova.network.os_vif_util [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.288 182759 DEBUG nova.network.os_vif_util [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.289 182759 DEBUG os_vif [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.290 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.290 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.291 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.308 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7e9d1d2-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.309 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7e9d1d2-92, col_values=(('external_ids', {'iface-id': 'e7e9d1d2-925c-4854-8e1c-90b360ae8694', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:b1:a8', 'vm-uuid': 'd30c29ef-0595-4d30-a826-50e21d7d3463'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.311 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 NetworkManager[55017]: <info>  [1769039822.3127] manager: (tape7e9d1d2-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.315 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.320 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.322 182759 INFO os_vif [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92')#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.323 182759 DEBUG nova.virt.libvirt.vif [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.324 182759 DEBUG nova.network.os_vif_util [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.325 182759 DEBUG nova.network.os_vif_util [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.332 182759 DEBUG nova.virt.libvirt.guest [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] attach device xml: <interface type="ethernet">
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <mac address="fa:16:3e:42:b1:a8"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <model type="virtio"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <mtu size="1442"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <target dev="tape7e9d1d2-92"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]: </interface>
Jan 21 18:57:02 np0005591285 nova_compute[182755]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 21 18:57:02 np0005591285 kernel: tape7e9d1d2-92: entered promiscuous mode
Jan 21 18:57:02 np0005591285 NetworkManager[55017]: <info>  [1769039822.3594] manager: (tape7e9d1d2-92): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.362 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:02Z|00191|binding|INFO|Claiming lport e7e9d1d2-925c-4854-8e1c-90b360ae8694 for this chassis.
Jan 21 18:57:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:02Z|00192|binding|INFO|e7e9d1d2-925c-4854-8e1c-90b360ae8694: Claiming fa:16:3e:42:b1:a8 10.100.0.13
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.372 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:b1:a8 10.100.0.13'], port_security=['fa:16:3e:42:b1:a8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd30c29ef-0595-4d30-a826-50e21d7d3463', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=e7e9d1d2-925c-4854-8e1c-90b360ae8694) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.376 104259 INFO neutron.agent.ovn.metadata.agent [-] Port e7e9d1d2-925c-4854-8e1c-90b360ae8694 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.378 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592#033[00m
Jan 21 18:57:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:02Z|00193|binding|INFO|Setting lport e7e9d1d2-925c-4854-8e1c-90b360ae8694 ovn-installed in OVS
Jan 21 18:57:02 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:02Z|00194|binding|INFO|Setting lport e7e9d1d2-925c-4854-8e1c-90b360ae8694 up in Southbound
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.389 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.394 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.412 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d3480ec0-f835-4573-889f-2cf5ce3d874c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:02 np0005591285 systemd-udevd[220173]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:57:02 np0005591285 NetworkManager[55017]: <info>  [1769039822.4510] device (tape7e9d1d2-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:57:02 np0005591285 NetworkManager[55017]: <info>  [1769039822.4539] device (tape7e9d1d2-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.471 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[79081eeb-8849-42cb-b3bf-5f3a2e81b332]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.477 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e85b2900-ac84-4c8e-b85e-27f775d41371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.490 182759 DEBUG nova.virt.libvirt.driver [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.490 182759 DEBUG nova.virt.libvirt.driver [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.491 182759 DEBUG nova.virt.libvirt.driver [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:04:81:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.491 182759 DEBUG nova.virt.libvirt.driver [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:42:b1:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.521 182759 DEBUG nova.virt.libvirt.guest [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:02</nova:creationTime>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:02 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    <nova:port uuid="e7e9d1d2-925c-4854-8e1c-90b360ae8694">
Jan 21 18:57:02 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:02 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:02 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:02 np0005591285 nova_compute[182755]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.525 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1dace9-2e68-4ab7-a856-f652407e3fef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.555 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5349e83a-ec59-4627-afe2-a1f3eca587a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434952, 'reachable_time': 38323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220180, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.562 182759 DEBUG oslo_concurrency.lockutils [None req-c52e6306-d0c0-4f2b-9872-1ebb86a5b8a8 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-d30c29ef-0595-4d30-a826-50e21d7d3463-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.586 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[53f126e7-b63b-46ef-8195-fde49f0283ed]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434971, 'tstamp': 434971}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220181, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434974, 'tstamp': 434974}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220181, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.590 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.593 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.595 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.595 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.596 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.597 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.625 182759 DEBUG nova.compute.manager [req-d20d5cb3-5a4b-4504-90e9-bddaa54dc6d9 req-9bcd2f37-3160-4b76-a0d8-e95c19356f4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.626 182759 DEBUG oslo_concurrency.lockutils [req-d20d5cb3-5a4b-4504-90e9-bddaa54dc6d9 req-9bcd2f37-3160-4b76-a0d8-e95c19356f4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.626 182759 DEBUG oslo_concurrency.lockutils [req-d20d5cb3-5a4b-4504-90e9-bddaa54dc6d9 req-9bcd2f37-3160-4b76-a0d8-e95c19356f4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.627 182759 DEBUG oslo_concurrency.lockutils [req-d20d5cb3-5a4b-4504-90e9-bddaa54dc6d9 req-9bcd2f37-3160-4b76-a0d8-e95c19356f4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.627 182759 DEBUG nova.compute.manager [req-d20d5cb3-5a4b-4504-90e9-bddaa54dc6d9 req-9bcd2f37-3160-4b76-a0d8-e95c19356f4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:02 np0005591285 nova_compute[182755]: 2026-01-21 23:57:02.627 182759 WARNING nova.compute.manager [req-d20d5cb3-5a4b-4504-90e9-bddaa54dc6d9 req-9bcd2f37-3160-4b76-a0d8-e95c19356f4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received unexpected event network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.962 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.963 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:02.964 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:03 np0005591285 nova_compute[182755]: 2026-01-21 23:57:03.659 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:03.867 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:57:03 np0005591285 nova_compute[182755]: 2026-01-21 23:57:03.867 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:03.868 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.620 182759 DEBUG nova.network.neutron [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updated VIF entry in instance network info cache for port e7e9d1d2-925c-4854-8e1c-90b360ae8694. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.621 182759 DEBUG nova.network.neutron [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.651 182759 DEBUG oslo_concurrency.lockutils [req-8ee978d6-249e-4275-a66c-3a5587657d0d req-173d3d0e-7326-4524-923a-1ded5f932b89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.720 182759 DEBUG nova.compute.manager [req-b54e2bf9-25f4-4ed8-9b01-87aabc1e8f84 req-22e39931-17a5-43ab-bb1c-dbd091baa78e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.721 182759 DEBUG oslo_concurrency.lockutils [req-b54e2bf9-25f4-4ed8-9b01-87aabc1e8f84 req-22e39931-17a5-43ab-bb1c-dbd091baa78e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.721 182759 DEBUG oslo_concurrency.lockutils [req-b54e2bf9-25f4-4ed8-9b01-87aabc1e8f84 req-22e39931-17a5-43ab-bb1c-dbd091baa78e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.721 182759 DEBUG oslo_concurrency.lockutils [req-b54e2bf9-25f4-4ed8-9b01-87aabc1e8f84 req-22e39931-17a5-43ab-bb1c-dbd091baa78e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.722 182759 DEBUG nova.compute.manager [req-b54e2bf9-25f4-4ed8-9b01-87aabc1e8f84 req-22e39931-17a5-43ab-bb1c-dbd091baa78e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:04 np0005591285 nova_compute[182755]: 2026-01-21 23:57:04.722 182759 WARNING nova.compute.manager [req-b54e2bf9-25f4-4ed8-9b01-87aabc1e8f84 req-22e39931-17a5-43ab-bb1c-dbd091baa78e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received unexpected event network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:57:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:05Z|00195|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.273 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:05Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:b1:a8 10.100.0.13
Jan 21 18:57:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:05Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:b1:a8 10.100.0.13
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.576 182759 DEBUG oslo_concurrency.lockutils [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-d30c29ef-0595-4d30-a826-50e21d7d3463-e7e9d1d2-925c-4854-8e1c-90b360ae8694" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.577 182759 DEBUG oslo_concurrency.lockutils [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-d30c29ef-0595-4d30-a826-50e21d7d3463-e7e9d1d2-925c-4854-8e1c-90b360ae8694" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.593 182759 DEBUG nova.objects.instance [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.626 182759 DEBUG nova.virt.libvirt.vif [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.627 182759 DEBUG nova.network.os_vif_util [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.629 182759 DEBUG nova.network.os_vif_util [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.633 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.636 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.638 182759 DEBUG nova.virt.libvirt.driver [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Attempting to detach device tape7e9d1d2-92 from instance d30c29ef-0595-4d30-a826-50e21d7d3463 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.638 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <mac address="fa:16:3e:42:b1:a8"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <model type="virtio"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <mtu size="1442"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <target dev="tape7e9d1d2-92"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </interface>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.645 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.651 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface>not found in domain: <domain type='kvm' id='28'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <name>instance-00000044</name>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <uuid>d30c29ef-0595-4d30-a826-50e21d7d3463</uuid>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:02</nova:creationTime>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:port uuid="e7e9d1d2-925c-4854-8e1c-90b360ae8694">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <memory unit='KiB'>131072</memory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <vcpu placement='static'>1</vcpu>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <resource>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <partition>/machine</partition>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </resource>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <sysinfo type='smbios'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='manufacturer'>RDO</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='product'>OpenStack Compute</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='serial'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='uuid'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='family'>Virtual Machine</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <boot dev='hd'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <smbios mode='sysinfo'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <vmcoreinfo state='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <cpu mode='custom' match='exact' check='full'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <model fallback='forbid'>Nehalem</model>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <feature policy='require' name='x2apic'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <feature policy='require' name='hypervisor'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <feature policy='require' name='vme'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <clock offset='utc'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <timer name='pit' tickpolicy='delay'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <timer name='hpet' present='no'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <on_poweroff>destroy</on_poweroff>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <on_reboot>restart</on_reboot>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <on_crash>destroy</on_crash>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <disk type='file' device='disk'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk' index='2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <backingStore type='file' index='3'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <format type='raw'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <backingStore/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      </backingStore>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='vda' bus='virtio'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='virtio-disk0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <disk type='file' device='cdrom'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='qemu' type='raw' cache='none'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config' index='1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <backingStore/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='sda' bus='sata'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <readonly/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='sata0-0-0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='0' model='pcie-root'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pcie.0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='1' port='0x10'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='2' port='0x11'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='3' port='0x12'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='4' port='0x13'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='5' port='0x14'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='6' port='0x15'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='7' port='0x16'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='8' port='0x17'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.8'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='9' port='0x18'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.9'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='10' port='0x19'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.10'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='11' port='0x1a'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.11'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='12' port='0x1b'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.12'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='13' port='0x1c'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.13'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='14' port='0x1d'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.14'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='15' port='0x1e'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.15'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='16' port='0x1f'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.16'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='17' port='0x20'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.17'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='18' port='0x21'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.18'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='19' port='0x22'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.19'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='20' port='0x23'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.20'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='21' port='0x24'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.21'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='22' port='0x25'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.22'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='23' port='0x26'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.23'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='24' port='0x27'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.24'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='25' port='0x28'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.25'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-pci-bridge'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.26'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='usb'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='sata' index='0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='ide'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:04:81:11'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='tap8a89069d-b6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='net0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:42:b1:a8'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='tape7e9d1d2-92'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='net1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <serial type='pty'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target type='isa-serial' port='0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <model name='isa-serial'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      </target>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <console type='pty' tty='/dev/pts/0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target type='serial' port='0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <input type='tablet' bus='usb'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='input0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='usb' bus='0' port='1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <input type='mouse' bus='ps2'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='input1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <input type='keyboard' bus='ps2'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='input2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <listen type='address' address='::0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <audio id='1' type='none'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model type='virtio' heads='1' primary='yes'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='video0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <watchdog model='itco' action='reset'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='watchdog0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </watchdog>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <memballoon model='virtio'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <stats period='10'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='balloon0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <rng model='virtio'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <backend model='random'>/dev/urandom</backend>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='rng0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <label>system_u:system_r:svirt_t:s0:c724,c873</label>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c724,c873</imagelabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <label>+107:+107</label>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <imagelabel>+107:+107</imagelabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.652 182759 INFO nova.virt.libvirt.driver [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tape7e9d1d2-92 from instance d30c29ef-0595-4d30-a826-50e21d7d3463 from the persistent domain config.
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.653 182759 DEBUG nova.virt.libvirt.driver [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] (1/8): Attempting to detach device tape7e9d1d2-92 with device alias net1 from instance d30c29ef-0595-4d30-a826-50e21d7d3463 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.653 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <mac address="fa:16:3e:42:b1:a8"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <model type="virtio"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <mtu size="1442"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <target dev="tape7e9d1d2-92"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </interface>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 21 18:57:05 np0005591285 kernel: tape7e9d1d2-92 (unregistering): left promiscuous mode
Jan 21 18:57:05 np0005591285 NetworkManager[55017]: <info>  [1769039825.7649] device (tape7e9d1d2-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:57:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:05Z|00196|binding|INFO|Releasing lport e7e9d1d2-925c-4854-8e1c-90b360ae8694 from this chassis (sb_readonly=0)
Jan 21 18:57:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:05Z|00197|binding|INFO|Setting lport e7e9d1d2-925c-4854-8e1c-90b360ae8694 down in Southbound
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.775 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:05 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:05Z|00198|binding|INFO|Removing iface tape7e9d1d2-92 ovn-installed in OVS
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.782 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:b1:a8 10.100.0.13'], port_security=['fa:16:3e:42:b1:a8 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'd30c29ef-0595-4d30-a826-50e21d7d3463', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=e7e9d1d2-925c-4854-8e1c-90b360ae8694) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.781 182759 DEBUG nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Received event <DeviceRemovedEvent: 1769039825.7803676, d30c29ef-0595-4d30-a826-50e21d7d3463 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.784 104259 INFO neutron.agent.ovn.metadata.agent [-] Port e7e9d1d2-925c-4854-8e1c-90b360ae8694 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.784 182759 DEBUG nova.virt.libvirt.driver [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Start waiting for the detach event from libvirt for device tape7e9d1d2-92 with device alias net1 for instance d30c29ef-0595-4d30-a826-50e21d7d3463 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.785 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.786 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.791 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface>not found in domain: <domain type='kvm' id='28'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <name>instance-00000044</name>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <uuid>d30c29ef-0595-4d30-a826-50e21d7d3463</uuid>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:02</nova:creationTime>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:port uuid="e7e9d1d2-925c-4854-8e1c-90b360ae8694">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <memory unit='KiB'>131072</memory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <vcpu placement='static'>1</vcpu>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <resource>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <partition>/machine</partition>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </resource>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <sysinfo type='smbios'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='manufacturer'>RDO</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='product'>OpenStack Compute</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='serial'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='uuid'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <entry name='family'>Virtual Machine</entry>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <boot dev='hd'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <smbios mode='sysinfo'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <vmcoreinfo state='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <cpu mode='custom' match='exact' check='full'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <model fallback='forbid'>Nehalem</model>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <feature policy='require' name='x2apic'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <feature policy='require' name='hypervisor'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <feature policy='require' name='vme'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <clock offset='utc'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <timer name='pit' tickpolicy='delay'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <timer name='hpet' present='no'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <on_poweroff>destroy</on_poweroff>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <on_reboot>restart</on_reboot>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <on_crash>destroy</on_crash>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <disk type='file' device='disk'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk' index='2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <backingStore type='file' index='3'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <format type='raw'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <backingStore/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      </backingStore>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='vda' bus='virtio'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='virtio-disk0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <disk type='file' device='cdrom'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='qemu' type='raw' cache='none'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config' index='1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <backingStore/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='sda' bus='sata'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <readonly/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='sata0-0-0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='0' model='pcie-root'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pcie.0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='1' port='0x10'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='2' port='0x11'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='3' port='0x12'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='4' port='0x13'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='5' port='0x14'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='6' port='0x15'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='7' port='0x16'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='8' port='0x17'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.8'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='9' port='0x18'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.9'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='10' port='0x19'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.10'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='11' port='0x1a'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.11'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='12' port='0x1b'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.12'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='13' port='0x1c'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.13'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='14' port='0x1d'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.14'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='15' port='0x1e'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.15'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='16' port='0x1f'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.16'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='17' port='0x20'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.17'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='18' port='0x21'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.18'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='19' port='0x22'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.19'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='20' port='0x23'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.20'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='21' port='0x24'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.21'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='22' port='0x25'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.22'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='23' port='0x26'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.23'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='24' port='0x27'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.24'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target chassis='25' port='0x28'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.25'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model name='pcie-pci-bridge'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='pci.26'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='usb'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <controller type='sata' index='0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='ide'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:04:81:11'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target dev='tap8a89069d-b6'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='net0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <serial type='pty'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target type='isa-serial' port='0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:        <model name='isa-serial'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      </target>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <console type='pty' tty='/dev/pts/0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <target type='serial' port='0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <input type='tablet' bus='usb'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='input0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='usb' bus='0' port='1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <input type='mouse' bus='ps2'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='input1'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <input type='keyboard' bus='ps2'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='input2'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <listen type='address' address='::0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <audio id='1' type='none'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <model type='virtio' heads='1' primary='yes'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='video0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <watchdog model='itco' action='reset'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='watchdog0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </watchdog>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <memballoon model='virtio'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <stats period='10'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='balloon0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <rng model='virtio'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <backend model='random'>/dev/urandom</backend>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <alias name='rng0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <label>system_u:system_r:svirt_t:s0:c724,c873</label>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c724,c873</imagelabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <label>+107:+107</label>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <imagelabel>+107:+107</imagelabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.793 182759 INFO nova.virt.libvirt.driver [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tape7e9d1d2-92 from instance d30c29ef-0595-4d30-a826-50e21d7d3463 from the live domain config.
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.795 182759 DEBUG nova.virt.libvirt.vif [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.795 182759 DEBUG nova.network.os_vif_util [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.796 182759 DEBUG nova.network.os_vif_util [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.797 182759 DEBUG os_vif [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.801 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.803 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7e9d1d2-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.804 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.806 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.808 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.812 182759 INFO os_vif [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92')#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.813 182759 DEBUG nova.virt.libvirt.guest [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:05</nova:creationTime>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:05 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:05 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:05 np0005591285 nova_compute[182755]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.822 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[470841f9-6bae-403e-b8dc-6512b04dd16b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.869 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6366cdc6-9d96-4e59-8e45-3ff382ff74c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.875 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1fcd73-2728-4c95-aea2-890da8230ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.921 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d87a5ba9-4380-4756-b326-b49090a15b28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.946 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c29ab816-8979-43f3-bb80-1e521a2bf437]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434952, 'reachable_time': 38323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220194, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.973 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[79254f93-508c-48d4-b63e-44fb48a76c31]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434971, 'tstamp': 434971}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220195, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434974, 'tstamp': 434974}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220195, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.975 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.977 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 nova_compute[182755]: 2026-01-21 23:57:05.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.978 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.979 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.979 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:05.979 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:06Z|00199|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.322 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.523 182759 DEBUG oslo_concurrency.lockutils [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.524 182759 DEBUG oslo_concurrency.lockutils [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.524 182759 DEBUG nova.network.neutron [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.603 182759 DEBUG nova.compute.manager [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-deleted-e7e9d1d2-925c-4854-8e1c-90b360ae8694 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.604 182759 INFO nova.compute.manager [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Neutron deleted interface e7e9d1d2-925c-4854-8e1c-90b360ae8694; detaching it from the instance and deleting it from the info cache#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.604 182759 DEBUG nova.network.neutron [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.633 182759 DEBUG nova.objects.instance [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'system_metadata' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.693 182759 DEBUG nova.objects.instance [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lazy-loading 'flavor' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.729 182759 DEBUG nova.virt.libvirt.vif [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.730 182759 DEBUG nova.network.os_vif_util [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.731 182759 DEBUG nova.network.os_vif_util [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.735 182759 DEBUG nova.virt.libvirt.guest [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.739 182759 DEBUG nova.virt.libvirt.guest [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface>not found in domain: <domain type='kvm' id='28'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <name>instance-00000044</name>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <uuid>d30c29ef-0595-4d30-a826-50e21d7d3463</uuid>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:05</nova:creationTime>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <memory unit='KiB'>131072</memory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <vcpu placement='static'>1</vcpu>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <resource>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <partition>/machine</partition>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </resource>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <sysinfo type='smbios'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='manufacturer'>RDO</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='product'>OpenStack Compute</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='serial'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='uuid'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='family'>Virtual Machine</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <boot dev='hd'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <smbios mode='sysinfo'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <vmcoreinfo state='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <cpu mode='custom' match='exact' check='full'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <model fallback='forbid'>Nehalem</model>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <feature policy='require' name='x2apic'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <feature policy='require' name='hypervisor'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <feature policy='require' name='vme'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <clock offset='utc'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <timer name='pit' tickpolicy='delay'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <timer name='hpet' present='no'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <on_poweroff>destroy</on_poweroff>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <on_reboot>restart</on_reboot>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <on_crash>destroy</on_crash>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <disk type='file' device='disk'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk' index='2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <backingStore type='file' index='3'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <format type='raw'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <backingStore/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      </backingStore>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target dev='vda' bus='virtio'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='virtio-disk0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <disk type='file' device='cdrom'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <driver name='qemu' type='raw' cache='none'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config' index='1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <backingStore/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target dev='sda' bus='sata'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <readonly/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='sata0-0-0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='0' model='pcie-root'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pcie.0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='1' port='0x10'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='2' port='0x11'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='3' port='0x12'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='4' port='0x13'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='5' port='0x14'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='6' port='0x15'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='7' port='0x16'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='8' port='0x17'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.8'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='9' port='0x18'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.9'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='10' port='0x19'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.10'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='11' port='0x1a'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.11'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='12' port='0x1b'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.12'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='13' port='0x1c'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.13'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='14' port='0x1d'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.14'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='15' port='0x1e'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.15'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='16' port='0x1f'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.16'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='17' port='0x20'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.17'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='18' port='0x21'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.18'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='19' port='0x22'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.19'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='20' port='0x23'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.20'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='21' port='0x24'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.21'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='22' port='0x25'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.22'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='23' port='0x26'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.23'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='24' port='0x27'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.24'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='25' port='0x28'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.25'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-pci-bridge'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.26'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='usb'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='sata' index='0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='ide'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:04:81:11'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target dev='tap8a89069d-b6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='net0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <serial type='pty'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target type='isa-serial' port='0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <model name='isa-serial'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      </target>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <console type='pty' tty='/dev/pts/0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target type='serial' port='0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <input type='tablet' bus='usb'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='input0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='usb' bus='0' port='1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <input type='mouse' bus='ps2'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='input1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <input type='keyboard' bus='ps2'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='input2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <listen type='address' address='::0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <audio id='1' type='none'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model type='virtio' heads='1' primary='yes'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='video0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <watchdog model='itco' action='reset'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='watchdog0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </watchdog>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <memballoon model='virtio'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <stats period='10'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='balloon0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <rng model='virtio'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <backend model='random'>/dev/urandom</backend>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='rng0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <label>system_u:system_r:svirt_t:s0:c724,c873</label>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c724,c873</imagelabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <label>+107:+107</label>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <imagelabel>+107:+107</imagelabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.741 182759 DEBUG nova.virt.libvirt.guest [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.746 182759 DEBUG nova.virt.libvirt.guest [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:42:b1:a8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape7e9d1d2-92"/></interface> not found in domain: <domain type='kvm' id='28'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <name>instance-00000044</name>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <uuid>d30c29ef-0595-4d30-a826-50e21d7d3463</uuid>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:05</nova:creationTime>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <memory unit='KiB'>131072</memory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <vcpu placement='static'>1</vcpu>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <resource>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <partition>/machine</partition>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </resource>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <sysinfo type='smbios'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='manufacturer'>RDO</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='product'>OpenStack Compute</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='serial'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='uuid'>d30c29ef-0595-4d30-a826-50e21d7d3463</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <entry name='family'>Virtual Machine</entry>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <boot dev='hd'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <smbios mode='sysinfo'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <vmcoreinfo state='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <cpu mode='custom' match='exact' check='full'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <model fallback='forbid'>Nehalem</model>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <feature policy='require' name='x2apic'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <feature policy='require' name='hypervisor'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <feature policy='require' name='vme'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <clock offset='utc'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <timer name='pit' tickpolicy='delay'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <timer name='hpet' present='no'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <on_poweroff>destroy</on_poweroff>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <on_reboot>restart</on_reboot>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <on_crash>destroy</on_crash>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <disk type='file' device='disk'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk' index='2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <backingStore type='file' index='3'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <format type='raw'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <backingStore/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      </backingStore>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target dev='vda' bus='virtio'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='virtio-disk0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <disk type='file' device='cdrom'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <driver name='qemu' type='raw' cache='none'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/disk.config' index='1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <backingStore/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target dev='sda' bus='sata'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <readonly/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='sata0-0-0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='0' model='pcie-root'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pcie.0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='1' port='0x10'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='2' port='0x11'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='3' port='0x12'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='4' port='0x13'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='5' port='0x14'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='6' port='0x15'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='7' port='0x16'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='8' port='0x17'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.8'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='9' port='0x18'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.9'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='10' port='0x19'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.10'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='11' port='0x1a'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.11'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='12' port='0x1b'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.12'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='13' port='0x1c'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.13'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='14' port='0x1d'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.14'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='15' port='0x1e'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.15'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='16' port='0x1f'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.16'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='17' port='0x20'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.17'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='18' port='0x21'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.18'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='19' port='0x22'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.19'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='20' port='0x23'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.20'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='21' port='0x24'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.21'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='22' port='0x25'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.22'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='23' port='0x26'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.23'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='24' port='0x27'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.24'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target chassis='25' port='0x28'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.25'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model name='pcie-pci-bridge'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='pci.26'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='usb'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <controller type='sata' index='0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='ide'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:04:81:11'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target dev='tap8a89069d-b6'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='net0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <serial type='pty'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target type='isa-serial' port='0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:        <model name='isa-serial'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      </target>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <console type='pty' tty='/dev/pts/0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <source path='/dev/pts/0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463/console.log' append='off'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <target type='serial' port='0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <input type='tablet' bus='usb'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='input0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='usb' bus='0' port='1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <input type='mouse' bus='ps2'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='input1'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <input type='keyboard' bus='ps2'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='input2'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <listen type='address' address='::0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <audio id='1' type='none'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <model type='virtio' heads='1' primary='yes'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='video0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <watchdog model='itco' action='reset'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='watchdog0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </watchdog>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <memballoon model='virtio'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <stats period='10'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='balloon0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <rng model='virtio'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <backend model='random'>/dev/urandom</backend>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <alias name='rng0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <label>system_u:system_r:svirt_t:s0:c724,c873</label>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c724,c873</imagelabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <label>+107:+107</label>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <imagelabel>+107:+107</imagelabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.747 182759 WARNING nova.virt.libvirt.driver [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Detaching interface fa:16:3e:42:b1:a8 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tape7e9d1d2-92' not found.
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.748 182759 DEBUG nova.virt.libvirt.vif [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.749 182759 DEBUG nova.network.os_vif_util [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converting VIF {"id": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "address": "fa:16:3e:42:b1:a8", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7e9d1d2-92", "ovs_interfaceid": "e7e9d1d2-925c-4854-8e1c-90b360ae8694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.749 182759 DEBUG nova.network.os_vif_util [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.750 182759 DEBUG os_vif [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.751 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.752 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7e9d1d2-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.752 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.755 182759 INFO os_vif [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:b1:a8,bridge_name='br-int',has_traffic_filtering=True,id=e7e9d1d2-925c-4854-8e1c-90b360ae8694,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7e9d1d2-92')
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.755 182759 DEBUG nova.virt.libvirt.guest [req-3b2bb319-b6f0-4a96-afb4-f1856d6a9aeb req-9b7c2a27-7691-4d30-b5cb-acb2dea2999d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:57:06</nova:creationTime>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
Jan 21 18:57:06 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:57:06 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:57:06 np0005591285 nova_compute[182755]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
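The `set metadata xml` block above is the per-instance descriptor Nova writes into the libvirt domain. A minimal stdlib sketch of reading it back with `xml.etree.ElementTree` (illustrative only, not Nova's own parsing code; the sample XML is a trimmed copy of the logged block):

```python
# Parse a trimmed copy of the nova:instance metadata logged above.
# ElementTree requires the nova namespace to be mapped explicitly.
import xml.etree.ElementTree as ET

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

metadata_xml = """\
<nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
  <nova:name>tempest-AttachInterfacesTestJSON-server-1062303531</nova:name>
  <nova:flavor name="m1.nano">
    <nova:memory>128</nova:memory>
    <nova:vcpus>1</nova:vcpus>
  </nova:flavor>
  <nova:ports>
    <nova:port uuid="8a89069d-b676-4bb0-ab19-1d71370566f0">
      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
    </nova:port>
  </nova:ports>
</nova:instance>
"""

root = ET.fromstring(metadata_xml)
instance_name = root.findtext("nova:name", namespaces=NOVA_NS)
flavor = root.find("nova:flavor", NOVA_NS)
flavor_name = flavor.get("name")
memory_mb = int(flavor.findtext("nova:memory", namespaces=NOVA_NS))
port_ips = [ip.get("address")
            for ip in root.findall(".//nova:port/nova:ip", NOVA_NS)]
```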
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.907 182759 DEBUG nova.compute.manager [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-unplugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.907 182759 DEBUG oslo_concurrency.lockutils [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.908 182759 DEBUG oslo_concurrency.lockutils [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.908 182759 DEBUG oslo_concurrency.lockutils [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.908 182759 DEBUG nova.compute.manager [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-unplugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.908 182759 WARNING nova.compute.manager [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received unexpected event network-vif-unplugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.908 182759 DEBUG nova.compute.manager [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.909 182759 DEBUG oslo_concurrency.lockutils [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.909 182759 DEBUG oslo_concurrency.lockutils [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.909 182759 DEBUG oslo_concurrency.lockutils [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.909 182759 DEBUG nova.compute.manager [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:06 np0005591285 nova_compute[182755]: 2026-01-21 23:57:06.910 182759 WARNING nova.compute.manager [req-0631618b-7ad2-4854-ad0d-e2af965d3a59 req-12ec9e79-2f1e-4aac-bf32-a493877473b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received unexpected event network-vif-plugged-e7e9d1d2-925c-4854-8e1c-90b360ae8694 for instance with vm_state active and task_state None.#033[00m
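The "Received unexpected event ... vm_state active and task_state None" warnings above follow from Nova's event-dispatch pattern: the compute manager keeps per-instance sets of awaited network events behind a `"<uuid>-events"` lock, and pops an entry when Neutron delivers the event. When nothing is waiting (as here, after an interface detach completed), the pop comes back empty and the event is logged as unexpected. A minimal sketch of that pattern under those assumptions (class and method names mirror the log; this is not Nova's actual implementation):

```python
# Toy model of the pop_instance_event pattern seen in the log: a
# per-instance map of awaited events guarded by a lock; popping an
# event nobody registered returns None -> "unexpected event" warning.
import threading
from collections import defaultdict

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()            # models the "<uuid>-events" lock
        self._events = defaultdict(dict)         # instance uuid -> {event name: waiter}

    def prepare_for_event(self, instance_uuid, event_name, waiter):
        with self._lock:
            self._events[instance_uuid][event_name] = waiter

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:
            # None means no one was waiting: the caller logs a warning.
            return self._events[instance_uuid].pop(event_name, None)

events = InstanceEvents()
events.prepare_for_event("d30c29ef", "network-vif-plugged-e7e9", "waiter")

expected = events.pop_instance_event("d30c29ef", "network-vif-plugged-e7e9")
unexpected = events.pop_instance_event("d30c29ef", "network-vif-unplugged-e7e9")
```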
Jan 21 18:57:07 np0005591285 podman[220196]: 2026-01-21 23:57:07.264161086 +0000 UTC m=+0.137804368 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 21 18:57:07 np0005591285 podman[220221]: 2026-01-21 23:57:07.388309823 +0000 UTC m=+0.084209264 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 18:57:07 np0005591285 podman[220220]: 2026-01-21 23:57:07.393660019 +0000 UTC m=+0.085653315 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 18:57:07 np0005591285 nova_compute[182755]: 2026-01-21 23:57:07.983 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:07 np0005591285 nova_compute[182755]: 2026-01-21 23:57:07.984 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:07 np0005591285 nova_compute[182755]: 2026-01-21 23:57:07.984 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:07 np0005591285 nova_compute[182755]: 2026-01-21 23:57:07.984 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:07 np0005591285 nova_compute[182755]: 2026-01-21 23:57:07.985 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:07.999 182759 INFO nova.compute.manager [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Terminating instance#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.011 182759 DEBUG nova.compute.manager [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:57:08 np0005591285 kernel: tap8a89069d-b6 (unregistering): left promiscuous mode
Jan 21 18:57:08 np0005591285 NetworkManager[55017]: <info>  [1769039828.0350] device (tap8a89069d-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:57:08 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:08Z|00200|binding|INFO|Releasing lport 8a89069d-b676-4bb0-ab19-1d71370566f0 from this chassis (sb_readonly=0)
Jan 21 18:57:08 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:08Z|00201|binding|INFO|Setting lport 8a89069d-b676-4bb0-ab19-1d71370566f0 down in Southbound
Jan 21 18:57:08 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:08Z|00202|binding|INFO|Removing iface tap8a89069d-b6 ovn-installed in OVS
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.042 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.057 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:81:11 10.100.0.14'], port_security=['fa:16:3e:04:81:11 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd30c29ef-0595-4d30-a826-50e21d7d3463', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3580c2cf-9b7e-4a0b-a165-8de5bca87e40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8a89069d-b676-4bb0-ab19-1d71370566f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.060 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8a89069d-b676-4bb0-ab19-1d71370566f0 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.062 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1995baab-0f8d-4658-a4fc-2d21868dc592, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.063 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59f3c800-7afe-46ca-9064-3688ad4c40f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.064 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace which is not needed anymore#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.064 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000044.scope: Deactivated successfully.
Jan 21 18:57:08 np0005591285 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000044.scope: Consumed 15.059s CPU time.
Jan 21 18:57:08 np0005591285 systemd-machined[154022]: Machine qemu-28-instance-00000044 terminated.
Jan 21 18:57:08 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [NOTICE]   (220002) : haproxy version is 2.8.14-c23fe91
Jan 21 18:57:08 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [NOTICE]   (220002) : path to executable is /usr/sbin/haproxy
Jan 21 18:57:08 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [WARNING]  (220002) : Exiting Master process...
Jan 21 18:57:08 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [ALERT]    (220002) : Current worker (220012) exited with code 143 (Terminated)
Jan 21 18:57:08 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[219986]: [WARNING]  (220002) : All workers exited. Exiting... (0)
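The haproxy worker's "exited with code 143 (Terminated)" above is the conventional shell encoding for a signal death: 128 plus the signal number, and SIGTERM is 15. A one-liner illustrating the arithmetic:

```python
# Exit status 143 = 128 + SIGTERM(15): the worker was killed during
# the metadata-namespace teardown, not a crash.
import signal

def exit_code_for_signal(sig):
    """Shell-style exit status for a process terminated by a signal."""
    return 128 + int(sig)

code = exit_code_for_signal(signal.SIGTERM)
```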
Jan 21 18:57:08 np0005591285 systemd[1]: libpod-3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6.scope: Deactivated successfully.
Jan 21 18:57:08 np0005591285 podman[220285]: 2026-01-21 23:57:08.233016162 +0000 UTC m=+0.061737096 container died 3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 18:57:08 np0005591285 NetworkManager[55017]: <info>  [1769039828.2383] manager: (tap8a89069d-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.241 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.262 182759 INFO nova.network.neutron [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Port e7e9d1d2-925c-4854-8e1c-90b360ae8694 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.263 182759 DEBUG nova.network.neutron [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [{"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6-userdata-shm.mount: Deactivated successfully.
Jan 21 18:57:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay-2934eaa9cd350eab130f04301aaf070eaa211f248ed4f4a407dbbdca6160b58f-merged.mount: Deactivated successfully.
Jan 21 18:57:08 np0005591285 podman[220285]: 2026-01-21 23:57:08.283505431 +0000 UTC m=+0.112226355 container cleanup 3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.293 182759 DEBUG oslo_concurrency.lockutils [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-d30c29ef-0595-4d30-a826-50e21d7d3463" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.304 182759 INFO nova.virt.libvirt.driver [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Instance destroyed successfully.#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.305 182759 DEBUG nova.objects.instance [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'resources' on Instance uuid d30c29ef-0595-4d30-a826-50e21d7d3463 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:08 np0005591285 systemd[1]: libpod-conmon-3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6.scope: Deactivated successfully.
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.323 182759 DEBUG nova.virt.libvirt.vif [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1062303531',display_name='tempest-AttachInterfacesTestJSON-server-1062303531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1062303531',id=68,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwgch/zV9NgtxoAjj9JBs7lcsR6b8YWvnQIu5vmkBx68RzwPILt1+wzp9eXPXNsQsEuX9cbGoTpflNJROPxrxLqg5cuoOtsv4rM+f7gXiy2vWn/dQAcGM72Np9ilcHeEQ==',key_name='tempest-keypair-829623367',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:56:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-2l69dl8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=d30c29ef-0595-4d30-a826-50e21d7d3463,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.323 182759 DEBUG nova.network.os_vif_util [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "8a89069d-b676-4bb0-ab19-1d71370566f0", "address": "fa:16:3e:04:81:11", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a89069d-b6", "ovs_interfaceid": "8a89069d-b676-4bb0-ab19-1d71370566f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.324 182759 DEBUG nova.network.os_vif_util [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.325 182759 DEBUG os_vif [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.330 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.330 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a89069d-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.388 182759 DEBUG oslo_concurrency.lockutils [None req-f149204a-fb2f-4bc3-b144-43c74e796542 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-d30c29ef-0595-4d30-a826-50e21d7d3463-e7e9d1d2-925c-4854-8e1c-90b360ae8694" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.389 182759 INFO os_vif [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:81:11,bridge_name='br-int',has_traffic_filtering=True,id=8a89069d-b676-4bb0-ab19-1d71370566f0,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a89069d-b6')#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.390 182759 INFO nova.virt.libvirt.driver [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Deleting instance files /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463_del#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.391 182759 INFO nova.virt.libvirt.driver [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Deletion of /var/lib/nova/instances/d30c29ef-0595-4d30-a826-50e21d7d3463_del complete#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.412 182759 DEBUG nova.compute.manager [req-ab1744f7-0698-40fd-8134-cf4e6006c5e8 req-a0fb4599-ae43-4c48-a4e6-d1aece439fe1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-unplugged-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.414 182759 DEBUG oslo_concurrency.lockutils [req-ab1744f7-0698-40fd-8134-cf4e6006c5e8 req-a0fb4599-ae43-4c48-a4e6-d1aece439fe1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.414 182759 DEBUG oslo_concurrency.lockutils [req-ab1744f7-0698-40fd-8134-cf4e6006c5e8 req-a0fb4599-ae43-4c48-a4e6-d1aece439fe1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.414 182759 DEBUG oslo_concurrency.lockutils [req-ab1744f7-0698-40fd-8134-cf4e6006c5e8 req-a0fb4599-ae43-4c48-a4e6-d1aece439fe1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.415 182759 DEBUG nova.compute.manager [req-ab1744f7-0698-40fd-8134-cf4e6006c5e8 req-a0fb4599-ae43-4c48-a4e6-d1aece439fe1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-unplugged-8a89069d-b676-4bb0-ab19-1d71370566f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.415 182759 DEBUG nova.compute.manager [req-ab1744f7-0698-40fd-8134-cf4e6006c5e8 req-a0fb4599-ae43-4c48-a4e6-d1aece439fe1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-unplugged-8a89069d-b676-4bb0-ab19-1d71370566f0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:57:08 np0005591285 podman[220332]: 2026-01-21 23:57:08.422204953 +0000 UTC m=+0.101152304 container remove 3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.428 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0ff61f-fe61-4b2c-ac15-2ff4fb6ea696]: (4, ('Wed Jan 21 11:57:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6)\n3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6\nWed Jan 21 11:57:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6)\n3f9cc8f07d73113514d18af2e39e30cc4ec66dabf08aa309b8dc166d66f2b7d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.430 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[86307d03-0f3d-49a4-87ee-8581e2e83fe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.432 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:08 np0005591285 kernel: tap1995baab-00: left promiscuous mode
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.434 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.437 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8c793c87-50e2-4400-bab9-d1211cfea4cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.452 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.460 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2013f8-e8a6-4084-b0f1-20520e25e54b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.461 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[22d28703-edcc-4e50-ae89-5c46d4d1f46f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.479 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c23830ba-80ee-4afa-af3b-356a82c66d67]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434942, 'reachable_time': 42366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220348, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 systemd[1]: run-netns-ovnmeta\x2d1995baab\x2d0f8d\x2d4658\x2da4fc\x2d2d21868dc592.mount: Deactivated successfully.
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.484 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:57:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:08.485 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[409053cd-fd26-4d70-bdf8-ccc11ecbbd5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.495 182759 INFO nova.compute.manager [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.495 182759 DEBUG oslo.service.loopingcall [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.495 182759 DEBUG nova.compute.manager [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:57:08 np0005591285 nova_compute[182755]: 2026-01-21 23:57:08.496 182759 DEBUG nova.network.neutron [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.633 182759 DEBUG nova.network.neutron [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.655 182759 INFO nova.compute.manager [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Took 1.16 seconds to deallocate network for instance.#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.766 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.767 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.799 182759 DEBUG nova.compute.manager [req-fc705ae3-b5d5-4a17-abaa-2d011f608f7f req-02ccc110-d73a-44b6-a33f-dc0f6ecc2b10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-deleted-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.858 182759 DEBUG nova.compute.provider_tree [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.874 182759 DEBUG nova.scheduler.client.report [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.901 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:09 np0005591285 nova_compute[182755]: 2026-01-21 23:57:09.946 182759 INFO nova.scheduler.client.report [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Deleted allocations for instance d30c29ef-0595-4d30-a826-50e21d7d3463#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.080 182759 DEBUG oslo_concurrency.lockutils [None req-10551a9e-f7f1-4f77-af74-7e8a47ab4752 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.276 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.535 182759 DEBUG nova.compute.manager [req-dc4d52d0-a32f-4f5b-b53d-e25e5f0b0f18 req-adb1af99-af82-4ab9-b918-bff736f84554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.536 182759 DEBUG oslo_concurrency.lockutils [req-dc4d52d0-a32f-4f5b-b53d-e25e5f0b0f18 req-adb1af99-af82-4ab9-b918-bff736f84554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.536 182759 DEBUG oslo_concurrency.lockutils [req-dc4d52d0-a32f-4f5b-b53d-e25e5f0b0f18 req-adb1af99-af82-4ab9-b918-bff736f84554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.536 182759 DEBUG oslo_concurrency.lockutils [req-dc4d52d0-a32f-4f5b-b53d-e25e5f0b0f18 req-adb1af99-af82-4ab9-b918-bff736f84554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d30c29ef-0595-4d30-a826-50e21d7d3463-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.536 182759 DEBUG nova.compute.manager [req-dc4d52d0-a32f-4f5b-b53d-e25e5f0b0f18 req-adb1af99-af82-4ab9-b918-bff736f84554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] No waiting events found dispatching network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:10 np0005591285 nova_compute[182755]: 2026-01-21 23:57:10.536 182759 WARNING nova.compute.manager [req-dc4d52d0-a32f-4f5b-b53d-e25e5f0b0f18 req-adb1af99-af82-4ab9-b918-bff736f84554 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Received unexpected event network-vif-plugged-8a89069d-b676-4bb0-ab19-1d71370566f0 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.442 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.443 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.459 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.566 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.567 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.575 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.575 182759 INFO nova.compute.claims [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.728 182759 DEBUG nova.compute.provider_tree [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.743 182759 DEBUG nova.scheduler.client.report [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.762 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.763 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.848 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.849 182759 DEBUG nova.network.neutron [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.876 182759 INFO nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 18:57:11 np0005591285 nova_compute[182755]: 2026-01-21 23:57:11.910 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.103 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.105 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.105 182759 INFO nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Creating image(s)
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.106 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.106 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.107 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.119 182759 DEBUG nova.policy [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.124 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.186 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.188 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.189 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.204 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.270 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.271 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.335 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk 1073741824" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.336 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.336 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.431 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.433 182759 DEBUG nova.virt.disk.api [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.434 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.528 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.530 182759 DEBUG nova.virt.disk.api [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.531 182759 DEBUG nova.objects.instance [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.551 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.552 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Ensure instance console log exists: /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.553 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.554 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:57:12 np0005591285 nova_compute[182755]: 2026-01-21 23:57:12.554 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.016 182759 DEBUG nova.network.neutron [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Successfully created port: d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.867 182759 DEBUG nova.network.neutron [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Successfully updated port: d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:57:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:13.871 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.904 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.905 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.905 182759 DEBUG nova.network.neutron [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.977 182759 DEBUG nova.compute.manager [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.977 182759 DEBUG nova.compute.manager [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing instance network info cache due to event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:57:13 np0005591285 nova_compute[182755]: 2026-01-21 23:57:13.978 182759 DEBUG oslo_concurrency.lockutils [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:57:14 np0005591285 nova_compute[182755]: 2026-01-21 23:57:14.117 182759 DEBUG nova.network.neutron [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:57:14 np0005591285 nova_compute[182755]: 2026-01-21 23:57:14.752 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.072 182759 DEBUG nova.network.neutron [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.095 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.096 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance network_info: |[{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.098 182759 DEBUG oslo_concurrency.lockutils [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.098 182759 DEBUG nova.network.neutron [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.107 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start _get_guest_xml network_info=[{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.122 182759 WARNING nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.134 182759 DEBUG nova.virt.libvirt.host [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.135 182759 DEBUG nova.virt.libvirt.host [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.141 182759 DEBUG nova.virt.libvirt.host [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.143 182759 DEBUG nova.virt.libvirt.host [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.146 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.147 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.148 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.148 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.149 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.150 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.150 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.151 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.151 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.152 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.153 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.153 182759 DEBUG nova.virt.hardware [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.161 182759 DEBUG nova.virt.libvirt.vif [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.162 182759 DEBUG nova.network.os_vif_util [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.166 182759 DEBUG nova.network.os_vif_util [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.169 182759 DEBUG nova.objects.instance [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.188 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <uuid>9308be91-9a92-4389-939a-8b03d37474cf</uuid>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <name>instance-00000046</name>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestJSON-server-396111842</nova:name>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:57:15</nova:creationTime>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        <nova:port uuid="d96fb6bb-9793-4373-8f62-3aa3f32af6a5">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <entry name="serial">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <entry name="uuid">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:c3:44:d7"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <target dev="tapd96fb6bb-97"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log" append="off"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:57:15 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:57:15 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:57:15 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:57:15 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.190 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Preparing to wait for external event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.191 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.191 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.192 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.193 182759 DEBUG nova.virt.libvirt.vif [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.193 182759 DEBUG nova.network.os_vif_util [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.194 182759 DEBUG nova.network.os_vif_util [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.195 182759 DEBUG os_vif [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.196 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.196 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.197 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.201 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.201 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd96fb6bb-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.202 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd96fb6bb-97, col_values=(('external_ids', {'iface-id': 'd96fb6bb-9793-4373-8f62-3aa3f32af6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:44:d7', 'vm-uuid': '9308be91-9a92-4389-939a-8b03d37474cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.248 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.249 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.251 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:57:15 np0005591285 NetworkManager[55017]: <info>  [1769039835.2527] manager: (tapd96fb6bb-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.261 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.262 182759 INFO os_vif [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.278 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.355 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.356 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.356 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:c3:44:d7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.357 182759 INFO nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Using config drive
Jan 21 18:57:15 np0005591285 nova_compute[182755]: 2026-01-21 23:57:15.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.403 182759 INFO nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Creating config drive at /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.409 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuucby4_9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.552 182759 DEBUG oslo_concurrency.processutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuucby4_9" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:16 np0005591285 kernel: tapd96fb6bb-97: entered promiscuous mode
Jan 21 18:57:16 np0005591285 NetworkManager[55017]: <info>  [1769039836.6341] manager: (tapd96fb6bb-97): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:16 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:16Z|00203|binding|INFO|Claiming lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for this chassis.
Jan 21 18:57:16 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:16Z|00204|binding|INFO|d96fb6bb-9793-4373-8f62-3aa3f32af6a5: Claiming fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:57:16 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:16Z|00205|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 ovn-installed in OVS
Jan 21 18:57:16 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:16Z|00206|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 up in Southbound
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.646 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.648 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.649 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.650 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.664 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7337ebfb-5b57-4f12-b792-45f5a7436b9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.665 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:57:16 np0005591285 systemd-udevd[220385]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.668 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.668 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[da14f27d-fc94-40f1-a63a-adf90dded492]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.669 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fdedb993-b2a3-425a-8315-205b03fa6c3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 systemd-machined[154022]: New machine qemu-29-instance-00000046.
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.680 182759 DEBUG nova.network.neutron [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updated VIF entry in instance network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.681 182759 DEBUG nova.network.neutron [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:16 np0005591285 NetworkManager[55017]: <info>  [1769039836.6841] device (tapd96fb6bb-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:57:16 np0005591285 NetworkManager[55017]: <info>  [1769039836.6848] device (tapd96fb6bb-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.687 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcbb690-a448-4b28-af91-cf26d92368fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 systemd[1]: Started Virtual Machine qemu-29-instance-00000046.
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.702 182759 DEBUG oslo_concurrency.lockutils [req-cf593e0e-74cd-4a83-8b45-977fc0d4d424 req-17732428-0ae3-4d2d-a5eb-c710610ff39a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.703 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c32e8701-c72c-4627-a726-eafb3f03f39f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.743 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5d54b2-e9ba-4f38-a7dd-f1a9715de720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.751 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[84a6485d-f73f-4ae1-a9a1-4ccbb465705c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 NetworkManager[55017]: <info>  [1769039836.7525] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.797 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[32f2be96-5851-435d-959a-4dc21848aa6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.802 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[12fa1a82-4823-4eb2-8c0f-09e1c513d837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 NetworkManager[55017]: <info>  [1769039836.8295] device (tap19c3e0c8-50): carrier: link connected
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.838 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[81b4153f-04bf-41c8-a820-6a7beb361d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.866 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[90f5f778-1e36-40fb-8625-1e7debac85eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439548, 'reachable_time': 34223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220418, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.889 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b6532381-c65f-4d76-b2d3-c05e034126e4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439548, 'tstamp': 439548}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220419, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.918 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f59706-2aac-4fd5-b622-412a55c2e2b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439548, 'reachable_time': 34223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220422, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.936 182759 DEBUG nova.compute.manager [req-890eff01-8e89-4519-ab87-3cfef04b6ac0 req-7b2665da-6d91-4141-bcdf-b11d6fb2cff2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.937 182759 DEBUG oslo_concurrency.lockutils [req-890eff01-8e89-4519-ab87-3cfef04b6ac0 req-7b2665da-6d91-4141-bcdf-b11d6fb2cff2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.938 182759 DEBUG oslo_concurrency.lockutils [req-890eff01-8e89-4519-ab87-3cfef04b6ac0 req-7b2665da-6d91-4141-bcdf-b11d6fb2cff2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.938 182759 DEBUG oslo_concurrency.lockutils [req-890eff01-8e89-4519-ab87-3cfef04b6ac0 req-7b2665da-6d91-4141-bcdf-b11d6fb2cff2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.939 182759 DEBUG nova.compute.manager [req-890eff01-8e89-4519-ab87-3cfef04b6ac0 req-7b2665da-6d91-4141-bcdf-b11d6fb2cff2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Processing event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:57:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:16.957 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9f82f318-2fd0-4827-b2ab-2da075b5eee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.997 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039836.9968204, 9308be91-9a92-4389-939a-8b03d37474cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:57:16 np0005591285 nova_compute[182755]: 2026-01-21 23:57:16.998 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Started (Lifecycle Event)#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.000 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.006 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.011 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance spawned successfully.#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.012 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.023 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac60b57-4fd3-4fa5-aff4-0459a8474005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.025 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.026 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.027 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.029 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:17 np0005591285 NetworkManager[55017]: <info>  [1769039837.0298] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.030 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:17 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.034 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.038 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.037 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:17 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:17Z|00207|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.040 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.042 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.043 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.043 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c85533ef-7d37-44ff-be76-e17923ad4135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.044 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:57:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:17.046 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.050 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.051 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.051 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.052 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.052 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.053 182759 DEBUG nova.virt.libvirt.driver [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.060 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.067 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.067 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039836.9970026, 9308be91-9a92-4389-939a-8b03d37474cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.068 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.093 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.098 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039837.0041847, 9308be91-9a92-4389-939a-8b03d37474cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.098 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.133 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.137 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.164 182759 INFO nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Took 5.06 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.165 182759 DEBUG nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.174 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.262 182759 INFO nova.compute.manager [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Took 5.73 seconds to build instance.#033[00m
Jan 21 18:57:17 np0005591285 nova_compute[182755]: 2026-01-21 23:57:17.280 182759 DEBUG oslo_concurrency.lockutils [None req-eb1381db-a837-4fc6-94f6-361592f36d64 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:17 np0005591285 podman[220459]: 2026-01-21 23:57:17.509924529 +0000 UTC m=+0.070449272 container create 77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 18:57:17 np0005591285 systemd[1]: Started libpod-conmon-77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c.scope.
Jan 21 18:57:17 np0005591285 podman[220459]: 2026-01-21 23:57:17.467049585 +0000 UTC m=+0.027574388 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:57:17 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:57:17 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a44e99290dfd448d39097315742f89b5fea21cc60729802930cf3554fff2caeb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:57:17 np0005591285 podman[220459]: 2026-01-21 23:57:17.63128721 +0000 UTC m=+0.191811993 container init 77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 18:57:17 np0005591285 podman[220459]: 2026-01-21 23:57:17.640634383 +0000 UTC m=+0.201159146 container start 77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:57:17 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [NOTICE]   (220479) : New worker (220481) forked
Jan 21 18:57:17 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [NOTICE]   (220479) : Loading success.
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.046 182759 DEBUG nova.compute.manager [req-03db9047-698a-4264-b8dc-618b191d8ea7 req-8416cd64-2f64-47dc-b4f4-34dadc9435cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.046 182759 DEBUG oslo_concurrency.lockutils [req-03db9047-698a-4264-b8dc-618b191d8ea7 req-8416cd64-2f64-47dc-b4f4-34dadc9435cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.047 182759 DEBUG oslo_concurrency.lockutils [req-03db9047-698a-4264-b8dc-618b191d8ea7 req-8416cd64-2f64-47dc-b4f4-34dadc9435cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.047 182759 DEBUG oslo_concurrency.lockutils [req-03db9047-698a-4264-b8dc-618b191d8ea7 req-8416cd64-2f64-47dc-b4f4-34dadc9435cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.048 182759 DEBUG nova.compute.manager [req-03db9047-698a-4264-b8dc-618b191d8ea7 req-8416cd64-2f64-47dc-b4f4-34dadc9435cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.048 182759 WARNING nova.compute.manager [req-03db9047-698a-4264-b8dc-618b191d8ea7 req-8416cd64-2f64-47dc-b4f4-34dadc9435cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.911 182759 DEBUG nova.compute.manager [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.912 182759 DEBUG nova.compute.manager [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing instance network info cache due to event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.912 182759 DEBUG oslo_concurrency.lockutils [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.912 182759 DEBUG oslo_concurrency.lockutils [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:57:19 np0005591285 nova_compute[182755]: 2026-01-21 23:57:19.913 182759 DEBUG nova.network.neutron [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:57:20 np0005591285 nova_compute[182755]: 2026-01-21 23:57:20.248 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:20 np0005591285 nova_compute[182755]: 2026-01-21 23:57:20.281 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:20 np0005591285 nova_compute[182755]: 2026-01-21 23:57:20.625 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:21 np0005591285 nova_compute[182755]: 2026-01-21 23:57:21.697 182759 DEBUG nova.network.neutron [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updated VIF entry in instance network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:57:21 np0005591285 nova_compute[182755]: 2026-01-21 23:57:21.698 182759 DEBUG nova.network.neutron [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:21 np0005591285 nova_compute[182755]: 2026-01-21 23:57:21.721 182759 DEBUG oslo_concurrency.lockutils [req-118460eb-8675-40ec-9ea9-31a05e635c04 req-2db545d6-f26f-49f9-8a22-e1f7cfb189e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:57:22 np0005591285 nova_compute[182755]: 2026-01-21 23:57:22.041 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:23 np0005591285 nova_compute[182755]: 2026-01-21 23:57:23.303 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039828.3016407, d30c29ef-0595-4d30-a826-50e21d7d3463 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:57:23 np0005591285 nova_compute[182755]: 2026-01-21 23:57:23.304 182759 INFO nova.compute.manager [-] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:57:23 np0005591285 nova_compute[182755]: 2026-01-21 23:57:23.330 182759 DEBUG nova.compute.manager [None req-419b80f9-ed9e-4020-a52e-7df6dd0f1d7c - - - - - -] [instance: d30c29ef-0595-4d30-a826-50e21d7d3463] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:25 np0005591285 nova_compute[182755]: 2026-01-21 23:57:25.297 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:26 np0005591285 podman[220490]: 2026-01-21 23:57:26.204815752 +0000 UTC m=+0.074034290 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git)
Jan 21 18:57:26 np0005591285 podman[220491]: 2026-01-21 23:57:26.206645141 +0000 UTC m=+0.067271466 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:57:26 np0005591285 nova_compute[182755]: 2026-01-21 23:57:26.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:26 np0005591285 nova_compute[182755]: 2026-01-21 23:57:26.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:26 np0005591285 nova_compute[182755]: 2026-01-21 23:57:26.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:26 np0005591285 nova_compute[182755]: 2026-01-21 23:57:26.216 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 18:57:26 np0005591285 nova_compute[182755]: 2026-01-21 23:57:26.251 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 18:57:27 np0005591285 nova_compute[182755]: 2026-01-21 23:57:27.252 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.223 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.228 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.256 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.257 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.257 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.258 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.350 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.465 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.466 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.554 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.746 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.748 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5526MB free_disk=73.2767562866211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.749 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.749 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.953 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 9308be91-9a92-4389-939a-8b03d37474cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.954 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:57:29 np0005591285 nova_compute[182755]: 2026-01-21 23:57:29.954 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.116 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.134 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.156 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.156 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.157 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.157 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.297 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:30 np0005591285 nova_compute[182755]: 2026-01-21 23:57:30.300 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:30Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:57:30 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:30Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:57:31 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:31Z|00208|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:57:31 np0005591285 nova_compute[182755]: 2026-01-21 23:57:31.377 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:32 np0005591285 nova_compute[182755]: 2026-01-21 23:57:32.165 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:32 np0005591285 nova_compute[182755]: 2026-01-21 23:57:32.262 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:57:33 np0005591285 podman[220556]: 2026-01-21 23:57:33.223776171 +0000 UTC m=+0.082792155 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.461 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.462 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.462 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:57:33 np0005591285 nova_compute[182755]: 2026-01-21 23:57:33.462 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:35 np0005591285 nova_compute[182755]: 2026-01-21 23:57:35.300 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:35 np0005591285 nova_compute[182755]: 2026-01-21 23:57:35.537 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:57:35 np0005591285 nova_compute[182755]: 2026-01-21 23:57:35.577 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:57:35 np0005591285 nova_compute[182755]: 2026-01-21 23:57:35.578 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:57:35 np0005591285 nova_compute[182755]: 2026-01-21 23:57:35.579 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:37 np0005591285 nova_compute[182755]: 2026-01-21 23:57:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:38 np0005591285 nova_compute[182755]: 2026-01-21 23:57:38.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:38 np0005591285 podman[220580]: 2026-01-21 23:57:38.236675725 +0000 UTC m=+0.096110559 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:57:38 np0005591285 podman[220581]: 2026-01-21 23:57:38.274549321 +0000 UTC m=+0.128616209 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:57:38 np0005591285 podman[220582]: 2026-01-21 23:57:38.290215537 +0000 UTC m=+0.134079038 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:57:40 np0005591285 nova_compute[182755]: 2026-01-21 23:57:40.303 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:57:43 np0005591285 nova_compute[182755]: 2026-01-21 23:57:43.689 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:57:43 np0005591285 nova_compute[182755]: 2026-01-21 23:57:43.715 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid 9308be91-9a92-4389-939a-8b03d37474cf _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 18:57:43 np0005591285 nova_compute[182755]: 2026-01-21 23:57:43.716 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:43 np0005591285 nova_compute[182755]: 2026-01-21 23:57:43.716 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:43 np0005591285 nova_compute[182755]: 2026-01-21 23:57:43.740 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:45 np0005591285 nova_compute[182755]: 2026-01-21 23:57:45.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:57:45 np0005591285 nova_compute[182755]: 2026-01-21 23:57:45.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:49 np0005591285 nova_compute[182755]: 2026-01-21 23:57:49.971 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:50 np0005591285 nova_compute[182755]: 2026-01-21 23:57:50.592 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.594 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.596 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.898 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.899 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.899 182759 INFO nova.compute.manager [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Rebooting instance
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.919 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.920 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:57:55 np0005591285 nova_compute[182755]: 2026-01-21 23:57:55.920 182759 DEBUG nova.network.neutron [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:57:57 np0005591285 podman[220648]: 2026-01-21 23:57:57.200191341 +0000 UTC m=+0.064559462 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible)
Jan 21 18:57:57 np0005591285 podman[220649]: 2026-01-21 23:57:57.200192191 +0000 UTC m=+0.063326649 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.044 182759 DEBUG nova.network.neutron [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.060 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.071 182759 DEBUG nova.compute.manager [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:57:58 np0005591285 kernel: tapd96fb6bb-97 (unregistering): left promiscuous mode
Jan 21 18:57:58 np0005591285 NetworkManager[55017]: <info>  [1769039878.2574] device (tapd96fb6bb-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.264 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:58Z|00209|binding|INFO|Releasing lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 from this chassis (sb_readonly=0)
Jan 21 18:57:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:58Z|00210|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 down in Southbound
Jan 21 18:57:58 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:58Z|00211|binding|INFO|Removing iface tapd96fb6bb-97 ovn-installed in OVS
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.267 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.285 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.288 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.291 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.293 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2c296a0b-1326-431b-859e-1df9657d0619]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.296 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.297 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 21 18:57:58 np0005591285 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000046.scope: Consumed 15.206s CPU time.
Jan 21 18:57:58 np0005591285 systemd-machined[154022]: Machine qemu-29-instance-00000046 terminated.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [NOTICE]   (220479) : haproxy version is 2.8.14-c23fe91
Jan 21 18:57:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [NOTICE]   (220479) : path to executable is /usr/sbin/haproxy
Jan 21 18:57:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [WARNING]  (220479) : Exiting Master process...
Jan 21 18:57:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [ALERT]    (220479) : Current worker (220481) exited with code 143 (Terminated)
Jan 21 18:57:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220474]: [WARNING]  (220479) : All workers exited. Exiting... (0)
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.469 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 systemd[1]: libpod-77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c.scope: Deactivated successfully.
Jan 21 18:57:58 np0005591285 podman[220713]: 2026-01-21 23:57:58.477181584 +0000 UTC m=+0.053643675 container died 77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 18:57:58 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c-userdata-shm.mount: Deactivated successfully.
Jan 21 18:57:58 np0005591285 systemd[1]: var-lib-containers-storage-overlay-a44e99290dfd448d39097315742f89b5fea21cc60729802930cf3554fff2caeb-merged.mount: Deactivated successfully.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.522 182759 DEBUG nova.compute.manager [req-bd80d865-2a1c-4381-b6fa-894f002779ae req-0e350668-c8e1-41f7-9be8-117a51dfa117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.523 182759 DEBUG oslo_concurrency.lockutils [req-bd80d865-2a1c-4381-b6fa-894f002779ae req-0e350668-c8e1-41f7-9be8-117a51dfa117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.524 182759 DEBUG oslo_concurrency.lockutils [req-bd80d865-2a1c-4381-b6fa-894f002779ae req-0e350668-c8e1-41f7-9be8-117a51dfa117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.524 182759 DEBUG oslo_concurrency.lockutils [req-bd80d865-2a1c-4381-b6fa-894f002779ae req-0e350668-c8e1-41f7-9be8-117a51dfa117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.524 182759 DEBUG nova.compute.manager [req-bd80d865-2a1c-4381-b6fa-894f002779ae req-0e350668-c8e1-41f7-9be8-117a51dfa117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.524 182759 WARNING nova.compute.manager [req-bd80d865-2a1c-4381-b6fa-894f002779ae req-0e350668-c8e1-41f7-9be8-117a51dfa117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state reboot_started_hard.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.526 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance destroyed successfully.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.526 182759 DEBUG nova.objects.instance [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:57:58 np0005591285 podman[220713]: 2026-01-21 23:57:58.528577588 +0000 UTC m=+0.105039709 container cleanup 77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 18:57:58 np0005591285 systemd[1]: libpod-conmon-77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c.scope: Deactivated successfully.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.554 182759 DEBUG nova.virt.libvirt.vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.555 182759 DEBUG nova.network.os_vif_util [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.556 182759 DEBUG nova.network.os_vif_util [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.557 182759 DEBUG os_vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.559 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd96fb6bb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.561 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.566 182759 INFO os_vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.574 182759 DEBUG nova.virt.libvirt.driver [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start _get_guest_xml network_info=[{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.577 182759 WARNING nova.virt.libvirt.driver [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.581 182759 DEBUG nova.virt.libvirt.host [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.581 182759 DEBUG nova.virt.libvirt.host [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.584 182759 DEBUG nova.virt.libvirt.host [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.584 182759 DEBUG nova.virt.libvirt.host [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.585 182759 DEBUG nova.virt.libvirt.driver [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.585 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.586 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.586 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.586 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.586 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.587 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.587 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.587 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.587 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.588 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.588 182759 DEBUG nova.virt.hardware [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.588 182759 DEBUG nova.objects.instance [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.606 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:58 np0005591285 podman[220757]: 2026-01-21 23:57:58.63741059 +0000 UTC m=+0.068096188 container remove 77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.646 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[38222984-e45c-4284-9568-effe335f9c08]: (4, ('Wed Jan 21 11:57:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c)\n77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c\nWed Jan 21 11:57:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c)\n77fef6afd9dbb1c67389f8a7aa3ab75a9a26585643b1f60383509571e163508c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.648 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e71b9374-b7ff-45c4-8469-494ac5f6132a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.650 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:58 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.651 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.682 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.686 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9feb71-91f2-42cd-9101-99ea6e6bf5c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.709 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf905e1-b20e-44a2-b1fa-91b63ca5927f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.711 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0feebbd6-62a4-4f62-9082-90a330a294fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.715 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.716 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.716 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.717 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.718 182759 DEBUG nova.virt.libvirt.vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:57:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.718 182759 DEBUG nova.network.os_vif_util [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.719 182759 DEBUG nova.network.os_vif_util [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.720 182759 DEBUG nova.objects.instance [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.733 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbc1e31-52a4-409a-8cbc-10420316a956]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439539, 'reachable_time': 22897, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220775, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.738 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:57:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:58.739 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[797797b2-46a0-4804-947b-c241d3822677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:58 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.741 182759 DEBUG nova.virt.libvirt.driver [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <uuid>9308be91-9a92-4389-939a-8b03d37474cf</uuid>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <name>instance-00000046</name>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestJSON-server-396111842</nova:name>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:57:58</nova:creationTime>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        <nova:port uuid="d96fb6bb-9793-4373-8f62-3aa3f32af6a5">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <entry name="serial">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <entry name="uuid">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:c3:44:d7"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <target dev="tapd96fb6bb-97"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log" append="off"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:57:58 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:57:58 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:57:58 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:57:58 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.742 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.820 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.822 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.915 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.917 182759 DEBUG nova.objects.instance [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:57:58 np0005591285 nova_compute[182755]: 2026-01-21 23:57:58.934 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.006 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.007 182759 DEBUG nova.virt.disk.api [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.008 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.101 182759 DEBUG oslo_concurrency.processutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.103 182759 DEBUG nova.virt.disk.api [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.104 182759 DEBUG nova.objects.instance [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.124 182759 DEBUG nova.virt.libvirt.vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.125 182759 DEBUG nova.network.os_vif_util [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.127 182759 DEBUG nova.network.os_vif_util [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.127 182759 DEBUG os_vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.129 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.130 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.131 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.137 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.138 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd96fb6bb-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.138 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd96fb6bb-97, col_values=(('external_ids', {'iface-id': 'd96fb6bb-9793-4373-8f62-3aa3f32af6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:44:d7', 'vm-uuid': '9308be91-9a92-4389-939a-8b03d37474cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.170 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.1712] manager: (tapd96fb6bb-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.174 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.178 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.179 182759 INFO os_vif [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')
Jan 21 18:57:59 np0005591285 kernel: tapd96fb6bb-97: entered promiscuous mode
Jan 21 18:57:59 np0005591285 systemd-udevd[220691]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.3084] manager: (tapd96fb6bb-97): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Jan 21 18:57:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:59Z|00212|binding|INFO|Claiming lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for this chassis.
Jan 21 18:57:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:59Z|00213|binding|INFO|d96fb6bb-9793-4373-8f62-3aa3f32af6a5: Claiming fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.311 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:59Z|00214|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 ovn-installed in OVS
Jan 21 18:57:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:59Z|00215|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 up in Southbound
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.329 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.3341] device (tapd96fb6bb-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.3355] device (tapd96fb6bb-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.336 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.338 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.340 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.360 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1af54096-b867-4747-9df3-99567af2d733]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.362 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.365 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.365 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1469c4ff-a02f-4521-8401-8d5fc0c57546]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.367 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c70314f4-3ca6-4056-9793-eccb96fb09d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 systemd-machined[154022]: New machine qemu-30-instance-00000046.
Jan 21 18:57:59 np0005591285 systemd[1]: Started Virtual Machine qemu-30-instance-00000046.
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.386 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6babeb-bcd7-441b-b331-e56070169622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.415 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[116db075-7848-4e35-85a0-1ca2d1916e25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.464 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7d5c5e-3830-4549-ad97-231f1286115f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.474 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecd0b4c-12bd-40a7-8059-2bf19f6d9eb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.4772] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.534 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c93aecc3-209e-4f41-820c-06f8b23f9c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.539 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2389522d-9810-49ec-ac95-4792719866fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.5709] device (tap19c3e0c8-50): carrier: link connected
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.583 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[41066445-25c8-44d8-8f5c-55d326e70160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.614 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[247925a5-6e8a-40dd-ae22-1a483f813eee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443822, 'reachable_time': 20448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220837, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.644 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[26ac752c-750c-4868-a546-7f03a63bb8ac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443822, 'tstamp': 443822}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220840, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.677 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ac791885-db57-4d73-a9ce-427fdaa540db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443822, 'reachable_time': 20448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220845, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.739 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[19573e5c-c80d-4c45-a04f-1046d32faa0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.748 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 9308be91-9a92-4389-939a-8b03d37474cf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.749 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039879.7474685, 9308be91-9a92-4389-939a-8b03d37474cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.750 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.755 182759 DEBUG nova.compute.manager [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.760 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance rebooted successfully.#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.761 182759 DEBUG nova.compute.manager [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.776 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.782 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.814 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.815 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039879.7498097, 9308be91-9a92-4389-939a-8b03d37474cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.815 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Started (Lifecycle Event)#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.829 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[125587b0-ea94-4bb7-a445-0139332aebff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.831 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.832 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.832 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:59 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.835 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:59 np0005591285 NetworkManager[55017]: <info>  [1769039879.8361] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.837 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.839 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.840 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:59 np0005591285 ovn_controller[94908]: 2026-01-21T23:57:59Z|00216|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.841 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.841 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.852 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b26fb1d2-b4f0-4bb6-9b42-68d11e4b5078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.853 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.854 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.855 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:57:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:57:59.857 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.862 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:57:59 np0005591285 nova_compute[182755]: 2026-01-21 23:57:59.886 182759 DEBUG oslo_concurrency.lockutils [None req-0a1daf86-2767-46a7-915c-2d1e4fd70c9c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:00 np0005591285 podman[220878]: 2026-01-21 23:58:00.258582538 +0000 UTC m=+0.059606289 container create 0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:00 np0005591285 systemd[1]: Started libpod-conmon-0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7.scope.
Jan 21 18:58:00 np0005591285 podman[220878]: 2026-01-21 23:58:00.22364552 +0000 UTC m=+0.024669321 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:58:00 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:58:00 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9452e20cd6787e03a11416e53ffb81e10eb8876aaacf60ec3f33298e94aad6f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:58:00 np0005591285 podman[220878]: 2026-01-21 23:58:00.378043697 +0000 UTC m=+0.179067438 container init 0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:00 np0005591285 podman[220878]: 2026-01-21 23:58:00.384023419 +0000 UTC m=+0.185047140 container start 0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 18:58:00 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [NOTICE]   (220899) : New worker (220901) forked
Jan 21 18:58:00 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [NOTICE]   (220899) : Loading success.
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.596 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.648 182759 DEBUG nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.649 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.649 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.649 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.649 182759 DEBUG nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.650 182759 WARNING nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.650 182759 DEBUG nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.650 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.650 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.650 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.651 182759 DEBUG nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.651 182759 WARNING nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.651 182759 DEBUG nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.651 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.651 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.652 182759 DEBUG oslo_concurrency.lockutils [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.652 182759 DEBUG nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:00 np0005591285 nova_compute[182755]: 2026-01-21 23:58:00.652 182759 WARNING nova.compute.manager [req-d23a6af5-c4a4-40f4-a3d8-b51214af049f req-db70e19d-654d-486d-a209-17f216afd50d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:01 np0005591285 nova_compute[182755]: 2026-01-21 23:58:01.501 182759 INFO nova.compute.manager [None req-785878ac-95c4-41ef-ab17-8316746ace31 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Get console output#033[00m
Jan 21 18:58:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:02.963 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:02.966 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:02.968 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:04 np0005591285 nova_compute[182755]: 2026-01-21 23:58:04.170 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:04 np0005591285 podman[220912]: 2026-01-21 23:58:04.218297688 +0000 UTC m=+0.079075445 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 18:58:04 np0005591285 nova_compute[182755]: 2026-01-21 23:58:04.867 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:04.867 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:04.869 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:58:05 np0005591285 nova_compute[182755]: 2026-01-21 23:58:05.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.143 182759 DEBUG oslo_concurrency.lockutils [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.144 182759 DEBUG oslo_concurrency.lockutils [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.144 182759 DEBUG nova.compute.manager [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.149 182759 DEBUG nova.compute.manager [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.151 182759 DEBUG nova.objects.instance [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.187 182759 DEBUG nova.objects.instance [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'info_cache' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:08 np0005591285 nova_compute[182755]: 2026-01-21 23:58:08.224 182759 DEBUG nova.virt.libvirt.driver [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 21 18:58:09 np0005591285 nova_compute[182755]: 2026-01-21 23:58:09.212 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:09 np0005591285 podman[220937]: 2026-01-21 23:58:09.240549806 +0000 UTC m=+0.105221625 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:09 np0005591285 podman[220938]: 2026-01-21 23:58:09.266958452 +0000 UTC m=+0.114745213 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 18:58:09 np0005591285 podman[220940]: 2026-01-21 23:58:09.284691293 +0000 UTC m=+0.136003540 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 21 18:58:10 np0005591285 nova_compute[182755]: 2026-01-21 23:58:10.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:13 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:13Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:58:14 np0005591285 nova_compute[182755]: 2026-01-21 23:58:14.214 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:14.871 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:15 np0005591285 nova_compute[182755]: 2026-01-21 23:58:15.606 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.492 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.493 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.513 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.657 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.658 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.669 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.670 182759 INFO nova.compute.claims [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.828 182759 DEBUG nova.compute.provider_tree [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.843 182759 DEBUG nova.scheduler.client.report [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.879 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.880 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.952 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.953 182759 DEBUG nova.network.neutron [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:58:16 np0005591285 nova_compute[182755]: 2026-01-21 23:58:16.979 182759 INFO nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.009 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.174 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.177 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.178 182759 INFO nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Creating image(s)#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.179 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.180 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.181 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.216 182759 DEBUG nova.policy [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.221 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.288 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.289 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.289 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.300 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.370 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.371 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.412 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.414 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.414 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.514 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.515 182759 DEBUG nova.virt.disk.api [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Checking if we can resize image /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.516 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.573 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.575 182759 DEBUG nova.virt.disk.api [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Cannot resize image /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.576 182759 DEBUG nova.objects.instance [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'migration_context' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.599 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.600 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Ensure instance console log exists: /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.601 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.601 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:58:17 np0005591285 nova_compute[182755]: 2026-01-21 23:58:17.602 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:58:18 np0005591285 nova_compute[182755]: 2026-01-21 23:58:18.290 182759 DEBUG nova.virt.libvirt.driver [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 18:58:18 np0005591285 nova_compute[182755]: 2026-01-21 23:58:18.924 182759 DEBUG nova.network.neutron [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Successfully created port: 8e162717-2b5c-4731-8484-d2c68330bdaa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 18:58:19 np0005591285 nova_compute[182755]: 2026-01-21 23:58:19.217 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.398 182759 DEBUG nova.network.neutron [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Successfully updated port: 8e162717-2b5c-4731-8484-d2c68330bdaa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.429 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.430 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.430 182759 DEBUG nova.network.neutron [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 18:58:20 np0005591285 kernel: tapd96fb6bb-97 (unregistering): left promiscuous mode
Jan 21 18:58:20 np0005591285 NetworkManager[55017]: <info>  [1769039900.4748] device (tapd96fb6bb-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.486 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:20Z|00217|binding|INFO|Releasing lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 from this chassis (sb_readonly=0)
Jan 21 18:58:20 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:20Z|00218|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 down in Southbound
Jan 21 18:58:20 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:20Z|00219|binding|INFO|Removing iface tapd96fb6bb-97 ovn-installed in OVS
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.494 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.495 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.497 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.498 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a60357-2909-43e1-b432-18603ac688b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.499 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.502 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.517 182759 DEBUG nova.compute.manager [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-changed-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.517 182759 DEBUG nova.compute.manager [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Refreshing instance network info cache due to event network-changed-8e162717-2b5c-4731-8484-d2c68330bdaa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.518 182759 DEBUG oslo_concurrency.lockutils [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 18:58:20 np0005591285 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 21 18:58:20 np0005591285 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000046.scope: Consumed 14.137s CPU time.
Jan 21 18:58:20 np0005591285 systemd-machined[154022]: Machine qemu-30-instance-00000046 terminated.
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.606 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [NOTICE]   (220899) : haproxy version is 2.8.14-c23fe91
Jan 21 18:58:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [NOTICE]   (220899) : path to executable is /usr/sbin/haproxy
Jan 21 18:58:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [WARNING]  (220899) : Exiting Master process...
Jan 21 18:58:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [ALERT]    (220899) : Current worker (220901) exited with code 143 (Terminated)
Jan 21 18:58:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[220894]: [WARNING]  (220899) : All workers exited. Exiting... (0)
Jan 21 18:58:20 np0005591285 systemd[1]: libpod-0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7.scope: Deactivated successfully.
Jan 21 18:58:20 np0005591285 podman[221054]: 2026-01-21 23:58:20.641017445 +0000 UTC m=+0.050372177 container died 0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 18:58:20 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7-userdata-shm.mount: Deactivated successfully.
Jan 21 18:58:20 np0005591285 systemd[1]: var-lib-containers-storage-overlay-9452e20cd6787e03a11416e53ffb81e10eb8876aaacf60ec3f33298e94aad6f5-merged.mount: Deactivated successfully.
Jan 21 18:58:20 np0005591285 podman[221054]: 2026-01-21 23:58:20.673019644 +0000 UTC m=+0.082374356 container cleanup 0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 18:58:20 np0005591285 systemd[1]: libpod-conmon-0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7.scope: Deactivated successfully.
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.707 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.712 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 podman[221084]: 2026-01-21 23:58:20.751642006 +0000 UTC m=+0.057029657 container remove 0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.756 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[108ca2b1-c25e-4b8d-ba17-c79c6a6bed12]: (4, ('Wed Jan 21 11:58:20 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7)\n0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7\nWed Jan 21 11:58:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7)\n0df379867abec058de90816a73a0a741a0f81f883fbb2baf54373ef14583e3a7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.761 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[57a26429-d7a4-4af8-aced-8a0eff35acb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.761 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.763 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 18:58:20 np0005591285 nova_compute[182755]: 2026-01-21 23:58:20.778 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.782 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0e177162-865f-4ea3-b42d-30287f6037b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.796 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9466de23-ec4c-45b3-85be-0231c0fd4796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.797 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f4fc7cd3-8c7d-4868-adcf-1d5a01f5d657]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.813 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[90488383-1fba-4b5a-9f13-62855e2a72ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443811, 'reachable_time': 17998, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221118, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:20 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.817 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 18:58:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:20.817 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[0b67924b-0ed7-491c-8830-c10208df3125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.313 182759 INFO nova.virt.libvirt.driver [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance shutdown successfully after 13 seconds.
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.322 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance destroyed successfully.
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.323 182759 DEBUG nova.objects.instance [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.343 182759 DEBUG nova.compute.manager [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.416 182759 DEBUG nova.network.neutron [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.460 182759 DEBUG oslo_concurrency.lockutils [None req-8591e522-8258-4c34-98fd-0c91f9b69a3e 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.608 182759 DEBUG nova.compute.manager [req-d7140b38-7f6f-4ba4-883f-ced8fdc77c3b req-db35b9ea-25dd-425d-8855-dba78e816520 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.609 182759 DEBUG oslo_concurrency.lockutils [req-d7140b38-7f6f-4ba4-883f-ced8fdc77c3b req-db35b9ea-25dd-425d-8855-dba78e816520 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.609 182759 DEBUG oslo_concurrency.lockutils [req-d7140b38-7f6f-4ba4-883f-ced8fdc77c3b req-db35b9ea-25dd-425d-8855-dba78e816520 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.610 182759 DEBUG oslo_concurrency.lockutils [req-d7140b38-7f6f-4ba4-883f-ced8fdc77c3b req-db35b9ea-25dd-425d-8855-dba78e816520 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.610 182759 DEBUG nova.compute.manager [req-d7140b38-7f6f-4ba4-883f-ced8fdc77c3b req-db35b9ea-25dd-425d-8855-dba78e816520 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 18:58:21 np0005591285 nova_compute[182755]: 2026-01-21 23:58:21.611 182759 WARNING nova.compute.manager [req-d7140b38-7f6f-4ba4-883f-ced8fdc77c3b req-db35b9ea-25dd-425d-8855-dba78e816520 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state stopped and task_state None.
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.161 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9308be91-9a92-4389-939a-8b03d37474cf', 'name': 'tempest-ServerActionsTestJSON-server-396111842', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'cccb624dbe6d4401a89e9cd254f91828', 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'hostId': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.167 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.167 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.168 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.168 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>]
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.170 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.172 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.172 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.173 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.175 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.175 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.175 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>]
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.177 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.178 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.179 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.179 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.180 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.181 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.182 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.183 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.183 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.183 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>]
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.184 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.185 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.186 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.187 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.188 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.189 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.190 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.190 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.191 12 DEBUG ceilometer.compute.pollsters [-] Instance 9308be91-9a92-4389-939a-8b03d37474cf was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000046, id=9308be91-9a92-4389-939a-8b03d37474cf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.192 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 18:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-21 23:58:23.192 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-396111842>]
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.593 182759 DEBUG nova.network.neutron [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updating instance_info_cache with network_info: [{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.619 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.619 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance network_info: |[{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.620 182759 DEBUG oslo_concurrency.lockutils [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.621 182759 DEBUG nova.network.neutron [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Refreshing network info cache for port 8e162717-2b5c-4731-8484-d2c68330bdaa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.626 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Start _get_guest_xml network_info=[{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.636 182759 WARNING nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.647 182759 DEBUG nova.virt.libvirt.host [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.649 182759 DEBUG nova.virt.libvirt.host [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.654 182759 DEBUG nova.virt.libvirt.host [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.655 182759 DEBUG nova.virt.libvirt.host [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.657 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.658 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.658 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.659 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.659 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.659 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.660 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.660 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.661 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.661 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.661 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.662 182759 DEBUG nova.virt.hardware [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.670 182759 DEBUG nova.virt.libvirt.vif [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:58:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1755799853',display_name='tempest-ServerStableDeviceRescueTest-server-1755799853',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1755799853',id=74,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-gcfnr2kl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name=
'tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:17Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=83fe04ea-7d77-4003-9276-6a7d268e942a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.670 182759 DEBUG nova.network.os_vif_util [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.672 182759 DEBUG nova.network.os_vif_util [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.674 182759 DEBUG nova.objects.instance [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'pci_devices' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.701 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <uuid>83fe04ea-7d77-4003-9276-6a7d268e942a</uuid>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <name>instance-0000004a</name>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1755799853</nova:name>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:58:23</nova:creationTime>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:user uuid="55710edfd4b24e368807c8b5087ec91c">tempest-ServerStableDeviceRescueTest-1256721315-project-member</nova:user>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:project uuid="011e84f966444a668bd6c0f5674f551f">tempest-ServerStableDeviceRescueTest-1256721315</nova:project>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        <nova:port uuid="8e162717-2b5c-4731-8484-d2c68330bdaa">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <entry name="serial">83fe04ea-7d77-4003-9276-6a7d268e942a</entry>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <entry name="uuid">83fe04ea-7d77-4003-9276-6a7d268e942a</entry>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:2d:84:20"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <target dev="tap8e162717-2b"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/console.log" append="off"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:58:23 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:58:23 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:58:23 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:58:23 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.702 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Preparing to wait for external event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.702 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.702 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.703 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.704 182759 DEBUG nova.virt.libvirt.vif [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:58:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1755799853',display_name='tempest-ServerStableDeviceRescueTest-server-1755799853',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1755799853',id=74,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-gcfnr2kl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_
user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:17Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=83fe04ea-7d77-4003-9276-6a7d268e942a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.704 182759 DEBUG nova.network.os_vif_util [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.705 182759 DEBUG nova.network.os_vif_util [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.706 182759 DEBUG os_vif [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.707 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.708 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.708 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.714 182759 DEBUG nova.compute.manager [req-6a997e44-10c4-4629-9a1f-fff51b9f4400 req-09b65492-3b91-4288-81fc-8206ab80f2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.715 182759 DEBUG oslo_concurrency.lockutils [req-6a997e44-10c4-4629-9a1f-fff51b9f4400 req-09b65492-3b91-4288-81fc-8206ab80f2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.715 182759 DEBUG oslo_concurrency.lockutils [req-6a997e44-10c4-4629-9a1f-fff51b9f4400 req-09b65492-3b91-4288-81fc-8206ab80f2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.716 182759 DEBUG oslo_concurrency.lockutils [req-6a997e44-10c4-4629-9a1f-fff51b9f4400 req-09b65492-3b91-4288-81fc-8206ab80f2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.716 182759 DEBUG nova.compute.manager [req-6a997e44-10c4-4629-9a1f-fff51b9f4400 req-09b65492-3b91-4288-81fc-8206ab80f2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.716 182759 WARNING nova.compute.manager [req-6a997e44-10c4-4629-9a1f-fff51b9f4400 req-09b65492-3b91-4288-81fc-8206ab80f2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state stopped and task_state None.#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.720 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e162717-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.721 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8e162717-2b, col_values=(('external_ids', {'iface-id': '8e162717-2b5c-4731-8484-d2c68330bdaa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:84:20', 'vm-uuid': '83fe04ea-7d77-4003-9276-6a7d268e942a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.723 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:23 np0005591285 NetworkManager[55017]: <info>  [1769039903.7247] manager: (tap8e162717-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.726 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.738 182759 INFO os_vif [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b')#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.798 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.825 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.825 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.825 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No VIF found with MAC fa:16:3e:2d:84:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.826 182759 INFO nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Using config drive#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.836 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'info_cache' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.925 182759 DEBUG oslo_concurrency.lockutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.925 182759 DEBUG oslo_concurrency.lockutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:58:23 np0005591285 nova_compute[182755]: 2026-01-21 23:58:23.926 182759 DEBUG nova.network.neutron [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:58:24 np0005591285 nova_compute[182755]: 2026-01-21 23:58:24.672 182759 INFO nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Creating config drive at /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config#033[00m
Jan 21 18:58:24 np0005591285 nova_compute[182755]: 2026-01-21 23:58:24.682 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppudp88vw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:24 np0005591285 nova_compute[182755]: 2026-01-21 23:58:24.829 182759 DEBUG oslo_concurrency.processutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppudp88vw" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:24 np0005591285 kernel: tap8e162717-2b: entered promiscuous mode
Jan 21 18:58:24 np0005591285 NetworkManager[55017]: <info>  [1769039904.9255] manager: (tap8e162717-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 21 18:58:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:24Z|00220|binding|INFO|Claiming lport 8e162717-2b5c-4731-8484-d2c68330bdaa for this chassis.
Jan 21 18:58:24 np0005591285 nova_compute[182755]: 2026-01-21 23:58:24.932 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:24Z|00221|binding|INFO|8e162717-2b5c-4731-8484-d2c68330bdaa: Claiming fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.944 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.946 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.949 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a#033[00m
Jan 21 18:58:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:24Z|00222|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa ovn-installed in OVS
Jan 21 18:58:24 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:24Z|00223|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa up in Southbound
Jan 21 18:58:24 np0005591285 nova_compute[182755]: 2026-01-21 23:58:24.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.966 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c79dd5bc-5e71-436f-a6f1-8c6e59fecd02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.967 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58cd83db-d1 in ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.970 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58cd83db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.970 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd1af6f-0701-4e5d-8643-58ed95ef21f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.971 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc0236a-66f5-44f8-b3b7-7d27cb181aaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:24 np0005591285 nova_compute[182755]: 2026-01-21 23:58:24.972 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:24 np0005591285 systemd-udevd[221145]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:24 np0005591285 systemd-machined[154022]: New machine qemu-31-instance-0000004a.
Jan 21 18:58:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:24.986 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9d38f0-da2c-4c4b-a88f-d1cb9e4fae6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:24 np0005591285 systemd[1]: Started Virtual Machine qemu-31-instance-0000004a.
Jan 21 18:58:24 np0005591285 NetworkManager[55017]: <info>  [1769039904.9948] device (tap8e162717-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:58:24 np0005591285 NetworkManager[55017]: <info>  [1769039904.9958] device (tap8e162717-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.016 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7d00f3-2cd6-4701-ba5c-ad90f3e6e967]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.057 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d8b99c-16e8-470c-b9c0-b028cf02bfe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 NetworkManager[55017]: <info>  [1769039905.0638] manager: (tap58cd83db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Jan 21 18:58:25 np0005591285 systemd-udevd[221149]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.063 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a97dec92-9b7e-4c48-8639-d64543dc8ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.104 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[71f75755-8df5-45be-a122-09c0cf252afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.108 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5e71ee-03bd-48e9-a31a-1862516d14ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 NetworkManager[55017]: <info>  [1769039905.1428] device (tap58cd83db-d0): carrier: link connected
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.149 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[658a57c7-098f-430d-9b72-cecacd10c32d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.165 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[92805047-5fac-482b-9958-241389a958f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446380, 'reachable_time': 20294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221181, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.189 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1a640d-2619-44dc-aa20-58244ced96e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:9a20'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446380, 'tstamp': 446380}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221187, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.213 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e431fd38-22e0-4eea-bd39-b605e87a7159]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446380, 'reachable_time': 20294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221190, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.254 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a75bd615-017d-47eb-a2a7-f7a87d30589d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.260 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039905.259847, 83fe04ea-7d77-4003-9276-6a7d268e942a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.262 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Started (Lifecycle Event)#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.296 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.302 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039905.2602108, 83fe04ea-7d77-4003-9276-6a7d268e942a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.303 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.323 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.324 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0fdebf84-05c4-4618-bc3d-8648b525b823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.326 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.326 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.327 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.327 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.329 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:25 np0005591285 kernel: tap58cd83db-d0: entered promiscuous mode
Jan 21 18:58:25 np0005591285 NetworkManager[55017]: <info>  [1769039905.3312] manager: (tap58cd83db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.336 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.338 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.339 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:25 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:25Z|00224|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.349 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.354 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.358 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.359 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.360 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[383cdb2a-b527-494c-a7f1-9fca3e049f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.361 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:58:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:25.362 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'env', 'PROCESS_TAG=haproxy-58cd83db-dcb3-409c-a108-07601ce5f67a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58cd83db-dcb3-409c-a108-07601ce5f67a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:58:25 np0005591285 nova_compute[182755]: 2026-01-21 23:58:25.608 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:25 np0005591285 podman[221226]: 2026-01-21 23:58:25.770245863 +0000 UTC m=+0.063094371 container create c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:25 np0005591285 systemd[1]: Started libpod-conmon-c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c.scope.
Jan 21 18:58:25 np0005591285 podman[221226]: 2026-01-21 23:58:25.730510696 +0000 UTC m=+0.023359204 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:58:25 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:58:25 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4aafa22a503263b8dfe2a7d4f993aa1426a1d52484c6b77fa9b39ea37380fa6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:58:25 np0005591285 podman[221226]: 2026-01-21 23:58:25.866795802 +0000 UTC m=+0.159644280 container init c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:25 np0005591285 podman[221226]: 2026-01-21 23:58:25.874885172 +0000 UTC m=+0.167733640 container start c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:25 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [NOTICE]   (221245) : New worker (221247) forked
Jan 21 18:58:25 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [NOTICE]   (221245) : Loading success.
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.241 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.482 182759 DEBUG nova.network.neutron [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.504 182759 DEBUG oslo_concurrency.lockutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.534 182759 DEBUG nova.compute.manager [req-ae49bf5a-5966-40f7-994c-b9f6c6f4ea25 req-8ed5ac22-9423-4df6-ac84-cf3c7aff07ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.534 182759 DEBUG oslo_concurrency.lockutils [req-ae49bf5a-5966-40f7-994c-b9f6c6f4ea25 req-8ed5ac22-9423-4df6-ac84-cf3c7aff07ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.535 182759 DEBUG oslo_concurrency.lockutils [req-ae49bf5a-5966-40f7-994c-b9f6c6f4ea25 req-8ed5ac22-9423-4df6-ac84-cf3c7aff07ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.535 182759 DEBUG oslo_concurrency.lockutils [req-ae49bf5a-5966-40f7-994c-b9f6c6f4ea25 req-8ed5ac22-9423-4df6-ac84-cf3c7aff07ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.536 182759 DEBUG nova.compute.manager [req-ae49bf5a-5966-40f7-994c-b9f6c6f4ea25 req-8ed5ac22-9423-4df6-ac84-cf3c7aff07ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Processing event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.537 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.545 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.546 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039907.5449588, 83fe04ea-7d77-4003-9276-6a7d268e942a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.547 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.550 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance destroyed successfully.#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.551 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.557 182759 INFO nova.virt.libvirt.driver [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance spawned successfully.#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.558 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.589 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.594 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.601 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.606 182759 DEBUG nova.virt.libvirt.vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.607 182759 DEBUG nova.network.os_vif_util [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.608 182759 DEBUG nova.network.os_vif_util [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.609 182759 DEBUG os_vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.612 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.613 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd96fb6bb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.619 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.620 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.621 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.622 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.623 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.624 182759 DEBUG nova.virt.libvirt.driver [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.631 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.633 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.636 182759 INFO os_vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.648 182759 DEBUG nova.virt.libvirt.driver [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start _get_guest_xml network_info=[{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.653 182759 WARNING nova.virt.libvirt.driver [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.660 182759 DEBUG nova.virt.libvirt.host [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.661 182759 DEBUG nova.virt.libvirt.host [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.664 182759 DEBUG nova.virt.libvirt.host [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.665 182759 DEBUG nova.virt.libvirt.host [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.667 182759 DEBUG nova.virt.libvirt.driver [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.668 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.669 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.669 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.671 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.671 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.671 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.672 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.672 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.673 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.673 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.673 182759 DEBUG nova.virt.hardware [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.674 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.697 182759 DEBUG nova.virt.libvirt.vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.698 182759 DEBUG nova.network.os_vif_util [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.700 182759 DEBUG nova.network.os_vif_util [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.702 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.711 182759 INFO nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Took 10.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.711 182759 DEBUG nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.724 182759 DEBUG nova.virt.libvirt.driver [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <uuid>9308be91-9a92-4389-939a-8b03d37474cf</uuid>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <name>instance-00000046</name>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestJSON-server-396111842</nova:name>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:58:27</nova:creationTime>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        <nova:port uuid="d96fb6bb-9793-4373-8f62-3aa3f32af6a5">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <entry name="serial">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <entry name="uuid">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:c3:44:d7"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <target dev="tapd96fb6bb-97"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log" append="off"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:58:27 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:58:27 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:58:27 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:58:27 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.726 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.813 182759 INFO nova.compute.manager [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Took 11.22 seconds to build instance.#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.822 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.822 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.850 182759 DEBUG oslo_concurrency.lockutils [None req-978889eb-2cc1-4b42-b652-b4e64952d3f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.892 182759 DEBUG nova.network.neutron [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updated VIF entry in instance network info cache for port 8e162717-2b5c-4731-8484-d2c68330bdaa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.893 182759 DEBUG nova.network.neutron [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updating instance_info_cache with network_info: [{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.898 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.900 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.921 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.949 182759 DEBUG oslo_concurrency.lockutils [req-0f34a3f2-e9a5-4ee0-ac5e-1d0d84b8ed75 req-9ffb12ee-808b-47d1-bdca-c2605cebc494 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.986 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.988 182759 DEBUG nova.virt.disk.api [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:58:27 np0005591285 nova_compute[182755]: 2026-01-21 23:58:27.989 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.056 182759 DEBUG oslo_concurrency.processutils [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.058 182759 DEBUG nova.virt.disk.api [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.059 182759 DEBUG nova.objects.instance [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.084 182759 DEBUG nova.virt.libvirt.vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.085 182759 DEBUG nova.network.os_vif_util [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.087 182759 DEBUG nova.network.os_vif_util [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.088 182759 DEBUG os_vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.089 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.090 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.092 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.096 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.097 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd96fb6bb-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.098 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd96fb6bb-97, col_values=(('external_ids', {'iface-id': 'd96fb6bb-9793-4373-8f62-3aa3f32af6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:44:d7', 'vm-uuid': '9308be91-9a92-4389-939a-8b03d37474cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.100 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.1026] manager: (tapd96fb6bb-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.109 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.110 182759 INFO os_vif [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:28 np0005591285 podman[221270]: 2026-01-21 23:58:28.225357229 +0000 UTC m=+0.088698597 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Jan 21 18:58:28 np0005591285 podman[221271]: 2026-01-21 23:58:28.227175357 +0000 UTC m=+0.086210728 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 18:58:28 np0005591285 kernel: tapd96fb6bb-97: entered promiscuous mode
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.2344] manager: (tapd96fb6bb-97): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.238 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:28Z|00225|binding|INFO|Claiming lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for this chassis.
Jan 21 18:58:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:28Z|00226|binding|INFO|d96fb6bb-9793-4373-8f62-3aa3f32af6a5: Claiming fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.241 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:28Z|00227|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 ovn-installed in OVS
Jan 21 18:58:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:28Z|00228|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 up in Southbound
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.252 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.254 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.255 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.259 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.269 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad95720-fe8f-456a-a010-aec5b7acec98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.270 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.272 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.273 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[25808154-f730-4da1-8a07-245bf63095e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.274 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3a242acb-713e-4146-a259-d8a670495ec5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.294 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[04af5af5-d157-4c67-a280-66a06440d3b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 systemd-udevd[221327]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:28 np0005591285 systemd-machined[154022]: New machine qemu-32-instance-00000046.
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.314 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa68787-8a73-4d8d-a867-0a62a7b78255]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.3167] device (tapd96fb6bb-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.3178] device (tapd96fb6bb-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:58:28 np0005591285 systemd[1]: Started Virtual Machine qemu-32-instance-00000046.
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.353 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2da5489e-f8c2-4c93-9da1-b63517539b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.3630] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.364 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd12dc5-bc04-44c7-968f-1dc03ca9922f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.402 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[75a2247f-d780-42b2-b724-954f8a6ea114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.407 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c019f637-036f-48d9-8c85-8df4d2bf0c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.4387] device (tap19c3e0c8-50): carrier: link connected
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.447 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f4d3e0-82be-4d21-9929-12d7bc43ece8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.467 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[259c7662-ba5c-4009-847e-72b07eaa5c1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446709, 'reachable_time': 41680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221358, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.489 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb646f7-5858-4d9b-bf52-975fd18570b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446709, 'tstamp': 446709}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221359, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.516 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[71542fd7-a01c-4139-8bba-74142d301e04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446709, 'reachable_time': 41680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221360, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.563 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8cab307d-aca9-4f54-8088-65d69fe57cd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.669 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc96d16-c2f1-4c93-b57e-0990a831631c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.671 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.671 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.672 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:28 np0005591285 NetworkManager[55017]: <info>  [1769039908.6756] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.675 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.682 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:28 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:28Z|00229|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.684 182759 DEBUG nova.compute.manager [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.695 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 9308be91-9a92-4389-939a-8b03d37474cf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.696 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039908.6836476, 9308be91-9a92-4389-939a-8b03d37474cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.687 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.697 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.696 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[84c49322-00f3-4acd-965b-01c9ba42a53d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.698 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:58:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:28.699 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.700 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.704 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance rebooted successfully.#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.705 182759 DEBUG nova.compute.manager [None req-2fe0f222-a8ec-45aa-933d-645e3c93712f 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.736 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.741 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.798 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.798 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039908.686625, 9308be91-9a92-4389-939a-8b03d37474cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.799 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Started (Lifecycle Event)#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.853 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:28 np0005591285 nova_compute[182755]: 2026-01-21 23:58:28.858 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:29 np0005591285 podman[221400]: 2026-01-21 23:58:29.2014014 +0000 UTC m=+0.085909841 container create ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:58:29 np0005591285 podman[221400]: 2026-01-21 23:58:29.15642766 +0000 UTC m=+0.040936151 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:58:29 np0005591285 systemd[1]: Started libpod-conmon-ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549.scope.
Jan 21 18:58:29 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:58:29 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c43af1648b638f28e991b5a990512950a94631aa9495cd065478fa8cadbf332e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:58:29 np0005591285 podman[221400]: 2026-01-21 23:58:29.336190035 +0000 UTC m=+0.220698496 container init ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:29 np0005591285 podman[221400]: 2026-01-21 23:58:29.34668054 +0000 UTC m=+0.231188961 container start ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [NOTICE]   (221421) : New worker (221423) forked
Jan 21 18:58:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [NOTICE]   (221421) : Loading success.
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.672 182759 DEBUG nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.673 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.674 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.674 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.674 182759 DEBUG nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.674 182759 WARNING nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.675 182759 DEBUG nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.675 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.675 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.676 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.676 182759 DEBUG nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.676 182759 WARNING nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.676 182759 DEBUG nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.677 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.677 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.677 182759 DEBUG oslo_concurrency.lockutils [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.677 182759 DEBUG nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:29 np0005591285 nova_compute[182755]: 2026-01-21 23:58:29.678 182759 WARNING nova.compute.manager [req-8f14142b-8ed4-40ab-b2f1-3b70c0ac494e req-76f8636a-d3ff-4164-a07e-36ae410e399b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.259 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.260 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.260 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.261 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.413 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.450 182759 DEBUG nova.compute.manager [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.512 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.516 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.550 182759 INFO nova.compute.manager [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] instance snapshotting#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.608 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.619 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.714 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.716 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.795 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:30 np0005591285 nova_compute[182755]: 2026-01-21 23:58:30.893 182759 INFO nova.virt.libvirt.driver [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Beginning live snapshot process#033[00m
Jan 21 18:58:31 np0005591285 virtqemud[182299]: invalid argument: disk vda does not have an active block job
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.195 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.241 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5365MB free_disk=73.27320861816406GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.270 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json -f qcow2" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.271 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.336 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json -f qcow2" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.349 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.374 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 9308be91-9a92-4389-939a-8b03d37474cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.374 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 83fe04ea-7d77-4003-9276-6a7d268e942a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.374 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.375 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.416 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.417 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpgg3dfry3/9c6d1bb46e114155b41d9a4e7b16232b.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.459 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.476 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpgg3dfry3/9c6d1bb46e114155b41d9a4e7b16232b.delta 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.477 182759 INFO nova.virt.libvirt.driver [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.481 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.506 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.506 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.556 182759 DEBUG nova.virt.libvirt.guest [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.561 182759 INFO nova.virt.libvirt.driver [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.619 182759 DEBUG nova.privsep.utils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.619 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpgg3dfry3/9c6d1bb46e114155b41d9a4e7b16232b.delta /var/lib/nova/instances/snapshots/tmpgg3dfry3/9c6d1bb46e114155b41d9a4e7b16232b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.816 182759 DEBUG oslo_concurrency.processutils [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpgg3dfry3/9c6d1bb46e114155b41d9a4e7b16232b.delta /var/lib/nova/instances/snapshots/tmpgg3dfry3/9c6d1bb46e114155b41d9a4e7b16232b" returned: 0 in 0.196s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:31 np0005591285 nova_compute[182755]: 2026-01-21 23:58:31.817 182759 INFO nova.virt.libvirt.driver [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Snapshot extracted, beginning image upload#033[00m
Jan 21 18:58:33 np0005591285 nova_compute[182755]: 2026-01-21 23:58:33.102 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.288 182759 INFO nova.compute.manager [None req-34a5e815-5638-4c72-9430-e2bf44c58d41 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Pausing#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.290 182759 DEBUG nova.objects.instance [None req-34a5e815-5638-4c72-9430-e2bf44c58d41 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.354 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039914.3506632, 9308be91-9a92-4389-939a-8b03d37474cf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.355 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.358 182759 DEBUG nova.compute.manager [None req-34a5e815-5638-4c72-9430-e2bf44c58d41 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.414 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.419 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.484 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.506 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.507 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.507 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.714 182759 INFO nova.virt.libvirt.driver [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Snapshot image upload complete#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.716 182759 INFO nova.compute.manager [None req-e32b7712-c5eb-47d5-9993-2dbe4639d9b7 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Took 4.15 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.862 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.863 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.863 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:58:34 np0005591285 nova_compute[182755]: 2026-01-21 23:58:34.863 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:35 np0005591285 podman[221471]: 2026-01-21 23:58:35.252188972 +0000 UTC m=+0.107487016 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:58:35 np0005591285 nova_compute[182755]: 2026-01-21 23:58:35.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:36 np0005591285 nova_compute[182755]: 2026-01-21 23:58:36.578 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:36 np0005591285 nova_compute[182755]: 2026-01-21 23:58:36.614 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:36 np0005591285 nova_compute[182755]: 2026-01-21 23:58:36.615 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:58:36 np0005591285 nova_compute[182755]: 2026-01-21 23:58:36.617 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:36 np0005591285 nova_compute[182755]: 2026-01-21 23:58:36.618 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.107 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.361 182759 INFO nova.compute.manager [None req-de529fbe-b5af-4346-b0d9-5dd3dcd66207 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Unpausing#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.363 182759 DEBUG nova.objects.instance [None req-de529fbe-b5af-4346-b0d9-5dd3dcd66207 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.427 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039918.4267757, 9308be91-9a92-4389-939a-8b03d37474cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.428 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:58:38 np0005591285 virtqemud[182299]: argument unsupported: QEMU guest agent is not configured
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.432 182759 DEBUG nova.virt.libvirt.guest [None req-de529fbe-b5af-4346-b0d9-5dd3dcd66207 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.433 182759 DEBUG nova.compute.manager [None req-de529fbe-b5af-4346-b0d9-5dd3dcd66207 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.466 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.472 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.517 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.788 182759 INFO nova.compute.manager [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Rescuing#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.790 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.790 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:58:38 np0005591285 nova_compute[182755]: 2026-01-21 23:58:38.791 182759 DEBUG nova.network.neutron [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:58:39 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:39Z|00230|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:58:39 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:39Z|00231|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 18:58:39 np0005591285 nova_compute[182755]: 2026-01-21 23:58:39.254 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:40 np0005591285 podman[221505]: 2026-01-21 23:58:40.226315115 +0000 UTC m=+0.075793907 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 18:58:40 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:40Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:58:40 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:40Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:58:40 np0005591285 podman[221506]: 2026-01-21 23:58:40.24015544 +0000 UTC m=+0.085646804 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 18:58:40 np0005591285 podman[221507]: 2026-01-21 23:58:40.263651677 +0000 UTC m=+0.112197414 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:58:40 np0005591285 nova_compute[182755]: 2026-01-21 23:58:40.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:40 np0005591285 nova_compute[182755]: 2026-01-21 23:58:40.754 182759 DEBUG nova.network.neutron [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updating instance_info_cache with network_info: [{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:40 np0005591285 nova_compute[182755]: 2026-01-21 23:58:40.797 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:41 np0005591285 nova_compute[182755]: 2026-01-21 23:58:41.178 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.113 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:43 np0005591285 kernel: tap8e162717-2b (unregistering): left promiscuous mode
Jan 21 18:58:43 np0005591285 NetworkManager[55017]: <info>  [1769039923.5912] device (tap8e162717-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:58:43 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:43Z|00232|binding|INFO|Releasing lport 8e162717-2b5c-4731-8484-d2c68330bdaa from this chassis (sb_readonly=0)
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:43 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:43Z|00233|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa down in Southbound
Jan 21 18:58:43 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:43Z|00234|binding|INFO|Removing iface tap8e162717-2b ovn-installed in OVS
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.605 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.615 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.617 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis#033[00m
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.619 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.621 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0292c031-1155-4359-bebe-6bb06ba899a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.622 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace which is not needed anymore#033[00m
Jan 21 18:58:43 np0005591285 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Jan 21 18:58:43 np0005591285 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004a.scope: Consumed 13.974s CPU time.
Jan 21 18:58:43 np0005591285 systemd-machined[154022]: Machine qemu-31-instance-0000004a terminated.
Jan 21 18:58:43 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [NOTICE]   (221245) : haproxy version is 2.8.14-c23fe91
Jan 21 18:58:43 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [NOTICE]   (221245) : path to executable is /usr/sbin/haproxy
Jan 21 18:58:43 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [WARNING]  (221245) : Exiting Master process...
Jan 21 18:58:43 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [WARNING]  (221245) : Exiting Master process...
Jan 21 18:58:43 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [ALERT]    (221245) : Current worker (221247) exited with code 143 (Terminated)
Jan 21 18:58:43 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221241]: [WARNING]  (221245) : All workers exited. Exiting... (0)
Jan 21 18:58:43 np0005591285 systemd[1]: libpod-c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c.scope: Deactivated successfully.
Jan 21 18:58:43 np0005591285 podman[221596]: 2026-01-21 23:58:43.784612449 +0000 UTC m=+0.052953188 container died c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:58:43 np0005591285 NetworkManager[55017]: <info>  [1769039923.8302] manager: (tap8e162717-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/121)
Jan 21 18:58:43 np0005591285 systemd-udevd[221578]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:43 np0005591285 systemd[1]: var-lib-containers-storage-overlay-a4aafa22a503263b8dfe2a7d4f993aa1426a1d52484c6b77fa9b39ea37380fa6-merged.mount: Deactivated successfully.
Jan 21 18:58:43 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c-userdata-shm.mount: Deactivated successfully.
Jan 21 18:58:43 np0005591285 kernel: tap8e162717-2b: entered promiscuous mode
Jan 21 18:58:43 np0005591285 kernel: tap8e162717-2b (unregistering): left promiscuous mode
Jan 21 18:58:43 np0005591285 podman[221596]: 2026-01-21 23:58:43.846200439 +0000 UTC m=+0.114541178 container cleanup c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.843 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:43 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:43Z|00235|binding|INFO|Claiming lport 8e162717-2b5c-4731-8484-d2c68330bdaa for this chassis.
Jan 21 18:58:43 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:43Z|00236|binding|INFO|8e162717-2b5c-4731-8484-d2c68330bdaa: Claiming fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:58:43 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:43Z|00237|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa ovn-installed in OVS
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.867 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:43 np0005591285 systemd[1]: libpod-conmon-c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c.scope: Deactivated successfully.
Jan 21 18:58:43 np0005591285 podman[221632]: 2026-01-21 23:58:43.950334593 +0000 UTC m=+0.065086556 container remove c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.959 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1360a09d-9e3f-46b8-b849-27569c2a7412]: (4, ('Wed Jan 21 11:58:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c)\nc3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c\nWed Jan 21 11:58:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (c3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c)\nc3297c860cb3fcad40c389c6254ebb4f07805d162354aaf53dbe9b412bfc1c9c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.962 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fb00b420-1fc5-4dc7-91d1-3942e7fefb33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:43.964 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:43 np0005591285 kernel: tap58cd83db-d0: left promiscuous mode
Jan 21 18:58:43 np0005591285 nova_compute[182755]: 2026-01-21 23:58:43.968 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.000 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:44 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:44Z|00238|if_status|INFO|Dropped 2 log messages in last 158 seconds (most recently, 158 seconds ago) due to excessive rate
Jan 21 18:58:44 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:44Z|00239|if_status|INFO|Not setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa down as sb is readonly
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.007 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0eee1a58-d2ec-41fa-b3b7-0a21df4f59f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:44Z|00240|binding|INFO|Releasing lport 8e162717-2b5c-4731-8484-d2c68330bdaa from this chassis (sb_readonly=0)
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.023 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.048 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f1b7ad-4a4b-4129-91b9-95f5d174f2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.048 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.051 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.053 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e324c16b-7dc7-4e31-94bd-dd9c00735fe9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.079 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[93f8fe5e-5100-4fff-9031-fda4f41470ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446370, 'reachable_time': 33863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221657, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.082 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.082 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[0db32165-1335-49a6-b90e-e4c08bedb913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.083 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis#033[00m
Jan 21 18:58:44 np0005591285 systemd[1]: run-netns-ovnmeta\x2d58cd83db\x2ddcb3\x2d409c\x2da108\x2d07601ce5f67a.mount: Deactivated successfully.
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.086 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.088 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[058d06e3-7b1c-4ce0-8704-529e6ae3ca7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.089 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.091 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:58:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:44.091 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd75a8cc-eb0d-4f72-b87f-00f550ca17e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.170 182759 DEBUG nova.compute.manager [req-5673ee45-d85e-4560-98cf-b1fe401d831b req-2727a51b-b079-48c4-859e-004e2eea71b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.171 182759 DEBUG oslo_concurrency.lockutils [req-5673ee45-d85e-4560-98cf-b1fe401d831b req-2727a51b-b079-48c4-859e-004e2eea71b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.171 182759 DEBUG oslo_concurrency.lockutils [req-5673ee45-d85e-4560-98cf-b1fe401d831b req-2727a51b-b079-48c4-859e-004e2eea71b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.172 182759 DEBUG oslo_concurrency.lockutils [req-5673ee45-d85e-4560-98cf-b1fe401d831b req-2727a51b-b079-48c4-859e-004e2eea71b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.172 182759 DEBUG nova.compute.manager [req-5673ee45-d85e-4560-98cf-b1fe401d831b req-2727a51b-b079-48c4-859e-004e2eea71b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.173 182759 WARNING nova.compute.manager [req-5673ee45-d85e-4560-98cf-b1fe401d831b req-2727a51b-b079-48c4-859e-004e2eea71b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state active and task_state rescuing.#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.199 182759 INFO nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance shutdown successfully after 3 seconds.#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.207 182759 INFO nova.virt.libvirt.driver [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance destroyed successfully.#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.208 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'numa_topology' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.225 182759 INFO nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Attempting a stable device rescue#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.618 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.625 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.625 182759 INFO nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Creating image(s)#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.626 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.626 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.627 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.628 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.644 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "21a6e6787783e19d6abd064b1f558cdd7dc1053f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:44 np0005591285 nova_compute[182755]: 2026-01-21 23:58:44.645 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "21a6e6787783e19d6abd064b1f558cdd7dc1053f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:45 np0005591285 nova_compute[182755]: 2026-01-21 23:58:45.714 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:45 np0005591285 nova_compute[182755]: 2026-01-21 23:58:45.993 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.076 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.part --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.078 182759 DEBUG nova.virt.images [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] 9a3bd0b3-59ef-4418-9294-c8fbfc38f79b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.080 182759 DEBUG nova.privsep.utils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.081 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.part /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.219 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.part /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.converted" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.228 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.256 182759 DEBUG nova.compute.manager [req-31f002fa-47d4-404c-9d93-31d89b3fba6b req-65246223-4d1e-40de-8c28-e5940901b122 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.257 182759 DEBUG oslo_concurrency.lockutils [req-31f002fa-47d4-404c-9d93-31d89b3fba6b req-65246223-4d1e-40de-8c28-e5940901b122 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.257 182759 DEBUG oslo_concurrency.lockutils [req-31f002fa-47d4-404c-9d93-31d89b3fba6b req-65246223-4d1e-40de-8c28-e5940901b122 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.257 182759 DEBUG oslo_concurrency.lockutils [req-31f002fa-47d4-404c-9d93-31d89b3fba6b req-65246223-4d1e-40de-8c28-e5940901b122 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.258 182759 DEBUG nova.compute.manager [req-31f002fa-47d4-404c-9d93-31d89b3fba6b req-65246223-4d1e-40de-8c28-e5940901b122 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.258 182759 WARNING nova.compute.manager [req-31f002fa-47d4-404c-9d93-31d89b3fba6b req-65246223-4d1e-40de-8c28-e5940901b122 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state active and task_state rescuing.#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.293 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.294 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "21a6e6787783e19d6abd064b1f558cdd7dc1053f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.307 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "21a6e6787783e19d6abd064b1f558cdd7dc1053f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.308 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "21a6e6787783e19d6abd064b1f558cdd7dc1053f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.323 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.403 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.404 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f,backing_fmt=raw /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.448 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f,backing_fmt=raw /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.rescue" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.450 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "21a6e6787783e19d6abd064b1f558cdd7dc1053f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.450 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'migration_context' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.472 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.479 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Start _get_guest_xml network_info=[{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:2d:84:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9a3bd0b3-59ef-4418-9294-c8fbfc38f79b', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.480 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'resources' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.513 182759 WARNING nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.526 182759 DEBUG nova.virt.libvirt.host [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.527 182759 DEBUG nova.virt.libvirt.host [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.531 182759 DEBUG nova.virt.libvirt.host [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.531 182759 DEBUG nova.virt.libvirt.host [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.534 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.534 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.535 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.535 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.536 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.536 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.537 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.537 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.538 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.538 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.539 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.539 182759 DEBUG nova.virt.hardware [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.540 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.565 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.661 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.663 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.664 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.666 182759 DEBUG oslo_concurrency.lockutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.669 182759 DEBUG nova.virt.libvirt.vif [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1755799853',display_name='tempest-ServerStableDeviceRescueTest-server-1755799853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1755799853',id=74,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-gcfnr2kl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:34Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=83fe04ea-7d77-4003-9276-6a7d268e942a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:2d:84:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.669 182759 DEBUG nova.network.os_vif_util [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:2d:84:20"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.671 182759 DEBUG nova.network.os_vif_util [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.674 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'pci_devices' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:46 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:46Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.720 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <uuid>83fe04ea-7d77-4003-9276-6a7d268e942a</uuid>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <name>instance-0000004a</name>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1755799853</nova:name>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:58:46</nova:creationTime>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:user uuid="55710edfd4b24e368807c8b5087ec91c">tempest-ServerStableDeviceRescueTest-1256721315-project-member</nova:user>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:project uuid="011e84f966444a668bd6c0f5674f551f">tempest-ServerStableDeviceRescueTest-1256721315</nova:project>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        <nova:port uuid="8e162717-2b5c-4731-8484-d2c68330bdaa">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <entry name="serial">83fe04ea-7d77-4003-9276-6a7d268e942a</entry>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <entry name="uuid">83fe04ea-7d77-4003-9276-6a7d268e942a</entry>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.rescue"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <target dev="sdb" bus="usb"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <boot order="1"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:2d:84:20"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <target dev="tap8e162717-2b"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/console.log" append="off"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:58:46 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:58:46 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:58:46 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:58:46 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.726 182759 INFO nova.virt.libvirt.driver [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance destroyed successfully.#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.813 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.814 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.814 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.814 182759 DEBUG nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No VIF found with MAC fa:16:3e:2d:84:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.815 182759 INFO nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Using config drive#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.835 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:46 np0005591285 nova_compute[182755]: 2026-01-21 23:58:46.871 182759 DEBUG nova.objects.instance [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'keypairs' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:47 np0005591285 nova_compute[182755]: 2026-01-21 23:58:47.953 182759 INFO nova.virt.libvirt.driver [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Creating config drive at /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config.rescue#033[00m
Jan 21 18:58:47 np0005591285 nova_compute[182755]: 2026-01-21 23:58:47.960 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8es5qqvm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.110 182759 DEBUG oslo_concurrency.processutils [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8es5qqvm" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.118 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 kernel: tap8e162717-2b: entered promiscuous mode
Jan 21 18:58:48 np0005591285 NetworkManager[55017]: <info>  [1769039928.2223] manager: (tap8e162717-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Jan 21 18:58:48 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:48Z|00241|binding|INFO|Claiming lport 8e162717-2b5c-4731-8484-d2c68330bdaa for this chassis.
Jan 21 18:58:48 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:48Z|00242|binding|INFO|8e162717-2b5c-4731-8484-d2c68330bdaa: Claiming fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:58:48 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:48Z|00243|binding|INFO|Removing lport 8e162717-2b5c-4731-8484-d2c68330bdaa ovn-installed in OVS
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.224 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.234 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.236 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.237 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a#033[00m
Jan 21 18:58:48 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:48Z|00244|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa up in Southbound
Jan 21 18:58:48 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:48Z|00245|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa ovn-installed in OVS
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.251 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.252 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[01239b3f-0cb5-43a6-b7a1-807b7b416369]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.253 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58cd83db-d1 in ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.254 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 systemd-udevd[221712]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.257 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58cd83db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.257 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9b49a5-fc88-4d67-810a-5afc031dfc8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.258 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[063248c6-61ee-422c-9236-41344ed57933]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 NetworkManager[55017]: <info>  [1769039928.2713] device (tap8e162717-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:58:48 np0005591285 NetworkManager[55017]: <info>  [1769039928.2721] device (tap8e162717-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.282 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[48fcd84b-df44-4abd-bc1b-a85fcc34d019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 systemd-machined[154022]: New machine qemu-33-instance-0000004a.
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.306 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7a8b54-0268-4b67-b2ec-f7a5edcdbf36]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 systemd[1]: Started Virtual Machine qemu-33-instance-0000004a.
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.350 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[797246d2-2aeb-401b-9135-d9b9925c5ea6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 NetworkManager[55017]: <info>  [1769039928.3583] manager: (tap58cd83db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.357 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[078e855e-6d77-4728-8f97-96a4ceda5e25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.388 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4ecba436-bc7f-4c73-8ff2-d29603fc91c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.391 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[05a28034-7c51-4fd6-afd6-29dc785b068f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 NetworkManager[55017]: <info>  [1769039928.4119] device (tap58cd83db-d0): carrier: link connected
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.418 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[910035d8-1ecf-4dbe-9531-261fc9fb339e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.435 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bfad0d43-aaee-4f94-99d0-b8fc59758739]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448707, 'reachable_time': 23860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221747, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.451 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a2755b03-5a95-44fd-8049-7ff292facfa0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:9a20'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448707, 'tstamp': 448707}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221748, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.470 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b26f7268-418b-4299-a609-6f2b99c130dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448707, 'reachable_time': 23860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221749, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.509 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b6332f32-b693-4e58-87c9-64d8ee0dabb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.578 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 83fe04ea-7d77-4003-9276-6a7d268e942a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.579 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039928.5771353, 83fe04ea-7d77-4003-9276-6a7d268e942a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.580 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.588 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[81c65857-3a47-4a99-b531-5bd305e9cd5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.591 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.591 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.592 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:48 np0005591285 kernel: tap58cd83db-d0: entered promiscuous mode
Jan 21 18:58:48 np0005591285 NetworkManager[55017]: <info>  [1769039928.5979] manager: (tap58cd83db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.598 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.601 182759 DEBUG nova.compute.manager [None req-62bfc0ee-7c8e-4943-9745-39f5147a1181 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.602 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:48 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:48Z|00246|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.604 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.608 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.609 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.609 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3da3b1-b7be-4bb6-9752-f8d4419d9c62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.611 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:58:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:48.612 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'env', 'PROCESS_TAG=haproxy-58cd83db-dcb3-409c-a108-07601ce5f67a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58cd83db-dcb3-409c-a108-07601ce5f67a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.620 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.629 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.633 182759 DEBUG nova.compute.manager [req-bc5c08ac-d24b-46a8-973d-17fa1f74c064 req-36a7c828-700f-4ad3-b80e-a41cc738eac6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.633 182759 DEBUG oslo_concurrency.lockutils [req-bc5c08ac-d24b-46a8-973d-17fa1f74c064 req-36a7c828-700f-4ad3-b80e-a41cc738eac6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.634 182759 DEBUG oslo_concurrency.lockutils [req-bc5c08ac-d24b-46a8-973d-17fa1f74c064 req-36a7c828-700f-4ad3-b80e-a41cc738eac6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.634 182759 DEBUG oslo_concurrency.lockutils [req-bc5c08ac-d24b-46a8-973d-17fa1f74c064 req-36a7c828-700f-4ad3-b80e-a41cc738eac6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.634 182759 DEBUG nova.compute.manager [req-bc5c08ac-d24b-46a8-973d-17fa1f74c064 req-36a7c828-700f-4ad3-b80e-a41cc738eac6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.635 182759 WARNING nova.compute.manager [req-bc5c08ac-d24b-46a8-973d-17fa1f74c064 req-36a7c828-700f-4ad3-b80e-a41cc738eac6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state active and task_state rescuing.#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.656 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.657 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039928.5777867, 83fe04ea-7d77-4003-9276-6a7d268e942a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.657 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Started (Lifecycle Event)#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.689 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:48 np0005591285 nova_compute[182755]: 2026-01-21 23:58:48.693 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:49 np0005591285 podman[221788]: 2026-01-21 23:58:49.021546088 +0000 UTC m=+0.059429793 container create a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 18:58:49 np0005591285 systemd[1]: Started libpod-conmon-a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff.scope.
Jan 21 18:58:49 np0005591285 podman[221788]: 2026-01-21 23:58:48.987206326 +0000 UTC m=+0.025090011 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:58:49 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:58:49 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f38509b0a9905f9ba44efcbf326d4bbdd1464fe04ef41310d81c6c170be1b1f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:58:49 np0005591285 podman[221788]: 2026-01-21 23:58:49.171601878 +0000 UTC m=+0.209485593 container init a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 18:58:49 np0005591285 podman[221788]: 2026-01-21 23:58:49.179707408 +0000 UTC m=+0.217591083 container start a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 18:58:49 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [NOTICE]   (221807) : New worker (221809) forked
Jan 21 18:58:49 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [NOTICE]   (221807) : Loading success.
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.033 182759 INFO nova.compute.manager [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Unrescuing#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.035 182759 DEBUG oslo_concurrency.lockutils [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.036 182759 DEBUG oslo_concurrency.lockutils [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.036 182759 DEBUG nova.network.neutron [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.182 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.661 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.758 182759 DEBUG nova.compute.manager [req-9e4e07ba-b025-41d2-9fc5-32ab95ba5be4 req-a70fce28-9587-4f15-8860-323bfb8ab9aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.759 182759 DEBUG oslo_concurrency.lockutils [req-9e4e07ba-b025-41d2-9fc5-32ab95ba5be4 req-a70fce28-9587-4f15-8860-323bfb8ab9aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.760 182759 DEBUG oslo_concurrency.lockutils [req-9e4e07ba-b025-41d2-9fc5-32ab95ba5be4 req-a70fce28-9587-4f15-8860-323bfb8ab9aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.761 182759 DEBUG oslo_concurrency.lockutils [req-9e4e07ba-b025-41d2-9fc5-32ab95ba5be4 req-a70fce28-9587-4f15-8860-323bfb8ab9aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.762 182759 DEBUG nova.compute.manager [req-9e4e07ba-b025-41d2-9fc5-32ab95ba5be4 req-a70fce28-9587-4f15-8860-323bfb8ab9aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:50 np0005591285 nova_compute[182755]: 2026-01-21 23:58:50.762 182759 WARNING nova.compute.manager [req-9e4e07ba-b025-41d2-9fc5-32ab95ba5be4 req-a70fce28-9587-4f15-8860-323bfb8ab9aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.098 182759 DEBUG nova.network.neutron [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updating instance_info_cache with network_info: [{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.121 182759 DEBUG oslo_concurrency.lockutils [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.123 182759 DEBUG nova.objects.instance [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'flavor' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:52 np0005591285 kernel: tap8e162717-2b (unregistering): left promiscuous mode
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.2236] device (tap8e162717-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00247|binding|INFO|Releasing lport 8e162717-2b5c-4731-8484-d2c68330bdaa from this chassis (sb_readonly=0)
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00248|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa down in Southbound
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.241 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00249|binding|INFO|Removing iface tap8e162717-2b ovn-installed in OVS
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.244 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.251 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.254 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.257 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.259 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[54ff9718-35e8-45f2-8918-f38b0cb72a96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.260 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace which is not needed anymore#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.261 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Jan 21 18:58:52 np0005591285 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000004a.scope: Consumed 3.930s CPU time.
Jan 21 18:58:52 np0005591285 systemd-machined[154022]: Machine qemu-33-instance-0000004a terminated.
Jan 21 18:58:52 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [NOTICE]   (221807) : haproxy version is 2.8.14-c23fe91
Jan 21 18:58:52 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [NOTICE]   (221807) : path to executable is /usr/sbin/haproxy
Jan 21 18:58:52 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [WARNING]  (221807) : Exiting Master process...
Jan 21 18:58:52 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [ALERT]    (221807) : Current worker (221809) exited with code 143 (Terminated)
Jan 21 18:58:52 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221803]: [WARNING]  (221807) : All workers exited. Exiting... (0)
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.4160] manager: (tap8e162717-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 21 18:58:52 np0005591285 kernel: tap8e162717-2b: entered promiscuous mode
Jan 21 18:58:52 np0005591285 kernel: tap8e162717-2b (unregistering): left promiscuous mode
Jan 21 18:58:52 np0005591285 systemd[1]: libpod-a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff.scope: Deactivated successfully.
Jan 21 18:58:52 np0005591285 podman[221842]: 2026-01-21 23:58:52.425407063 +0000 UTC m=+0.056074932 container died a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.430 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff-userdata-shm.mount: Deactivated successfully.
Jan 21 18:58:52 np0005591285 systemd[1]: var-lib-containers-storage-overlay-1f38509b0a9905f9ba44efcbf326d4bbdd1464fe04ef41310d81c6c170be1b1f-merged.mount: Deactivated successfully.
Jan 21 18:58:52 np0005591285 podman[221842]: 2026-01-21 23:58:52.478141143 +0000 UTC m=+0.108809002 container cleanup a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 18:58:52 np0005591285 systemd[1]: libpod-conmon-a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff.scope: Deactivated successfully.
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.494 182759 INFO nova.virt.libvirt.driver [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance destroyed successfully.#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.495 182759 DEBUG nova.objects.instance [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'numa_topology' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:52 np0005591285 podman[221890]: 2026-01-21 23:58:52.557032933 +0000 UTC m=+0.052355430 container remove a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.564 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7e37dc-8dda-459d-b318-91ffacc3fed6]: (4, ('Wed Jan 21 11:58:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff)\na847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff\nWed Jan 21 11:58:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (a847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff)\na847191e94127d7b7089518f7f2b728be2e24a678d41e636951eb23a7eac41ff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.566 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[79e07cc8-ff0f-4bcf-98b3-bd0df254776c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.568 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 kernel: tap58cd83db-d0: left promiscuous mode
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.605 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a60790c7-7893-4336-9627-b701253ff0ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 kernel: tap8e162717-2b: entered promiscuous mode
Jan 21 18:58:52 np0005591285 systemd-udevd[221820]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.6215] manager: (tap8e162717-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00250|binding|INFO|Claiming lport 8e162717-2b5c-4731-8484-d2c68330bdaa for this chassis.
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00251|binding|INFO|8e162717-2b5c-4731-8484-d2c68330bdaa: Claiming fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.623 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7b181f2a-9f26-4897-9b50-ec9940b27831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.625 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[14ea8636-604d-4e49-987c-812d45862615]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.635 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.6361] device (tap8e162717-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.6366] device (tap8e162717-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00252|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa ovn-installed in OVS
Jan 21 18:58:52 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:52Z|00253|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa up in Southbound
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 nova_compute[182755]: 2026-01-21 23:58:52.645 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.653 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a4570d4f-2496-43d3-9ec4-b4fd5c690996]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448700, 'reachable_time': 22774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221922, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.657 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.658 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[ab94ac5a-7a12-44d7-b127-b73eb6040660]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 systemd[1]: run-netns-ovnmeta\x2d58cd83db\x2ddcb3\x2d409c\x2da108\x2d07601ce5f67a.mount: Deactivated successfully.
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.660 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.662 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a#033[00m
Jan 21 18:58:52 np0005591285 systemd-machined[154022]: New machine qemu-34-instance-0000004a.
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.679 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3ef1b5-9e88-4c63-b0d0-2d06d2374f74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.681 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58cd83db-d1 in ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.683 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58cd83db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.683 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b869aa-4e9c-49fb-9269-ab0d6cf5dabb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.685 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec4b0e5-a7e0-4827-9d50-ed576f18b743]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 systemd[1]: Started Virtual Machine qemu-34-instance-0000004a.
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.706 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[d8369356-ae2e-4c1d-86eb-9daed3b01148]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.727 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ed1a66-6ff6-4403-a197-6fbe24f571c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.783 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cba1677c-5592-4e6f-ad36-d041c529ec6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.793 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a4f72a-d43a-4375-afd6-5a42361723f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.7951] manager: (tap58cd83db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.844 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bf46132b-94d6-4ae5-a1de-bbc4e15afc13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.849 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3f77e5b2-fb57-4862-8534-7468e19ce2c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 NetworkManager[55017]: <info>  [1769039932.8826] device (tap58cd83db-d0): carrier: link connected
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.889 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7c9fa4-613e-4204-940a-283a90ad0d63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.918 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d76e387-57e3-4ef8-a337-be8708597e05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449154, 'reachable_time': 29213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221955, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.941 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[af31970b-7768-4854-a6a4-78c4399789f7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:9a20'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449154, 'tstamp': 449154}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221956, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:52.980 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e5cc2c-aa03-4059-a903-dac16197ca58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449154, 'reachable_time': 29213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221957, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.025 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e65f77aa-8585-41c5-af7f-4409756650d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.121 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.128 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5844ed12-9cd8-4139-a3ef-79e8bfb7c2db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.130 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.131 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.132 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.134 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:53 np0005591285 NetworkManager[55017]: <info>  [1769039933.1356] manager: (tap58cd83db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Jan 21 18:58:53 np0005591285 kernel: tap58cd83db-d0: entered promiscuous mode
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.138 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.141 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.142 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:53 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:53Z|00254|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.144 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.144 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.150 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fba50662-c976-4439-aec1-d0576cd6f473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.151 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:58:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:53.152 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'env', 'PROCESS_TAG=haproxy-58cd83db-dcb3-409c-a108-07601ce5f67a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58cd83db-dcb3-409c-a108-07601ce5f67a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.159 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.178 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 83fe04ea-7d77-4003-9276-6a7d268e942a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.180 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039933.1781785, 83fe04ea-7d77-4003-9276-6a7d268e942a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.180 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.185 182759 DEBUG nova.compute.manager [None req-2dc6dd16-caea-436f-ae4f-5bb6e1f70d09 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.240 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.244 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.268 182759 DEBUG nova.compute.manager [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.269 182759 DEBUG oslo_concurrency.lockutils [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.270 182759 DEBUG oslo_concurrency.lockutils [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.270 182759 DEBUG oslo_concurrency.lockutils [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.271 182759 DEBUG nova.compute.manager [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.271 182759 WARNING nova.compute.manager [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.272 182759 DEBUG nova.compute.manager [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.273 182759 DEBUG oslo_concurrency.lockutils [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.273 182759 DEBUG oslo_concurrency.lockutils [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.274 182759 DEBUG oslo_concurrency.lockutils [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.274 182759 DEBUG nova.compute.manager [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.275 182759 WARNING nova.compute.manager [req-18192112-2b59-495e-9347-b8e734433e32 req-611311f3-6d22-4d93-a8eb-6c3354ba6105 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.296 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.297 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039933.1799445, 83fe04ea-7d77-4003-9276-6a7d268e942a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.297 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Started (Lifecycle Event)#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.352 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.357 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.360 182759 DEBUG oslo_concurrency.lockutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.361 182759 DEBUG oslo_concurrency.lockutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.361 182759 INFO nova.compute.manager [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Rebooting instance#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.383 182759 DEBUG oslo_concurrency.lockutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.384 182759 DEBUG oslo_concurrency.lockutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:58:53 np0005591285 nova_compute[182755]: 2026-01-21 23:58:53.385 182759 DEBUG nova.network.neutron [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:58:53 np0005591285 podman[221996]: 2026-01-21 23:58:53.575719441 +0000 UTC m=+0.074665506 container create 0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:53 np0005591285 podman[221996]: 2026-01-21 23:58:53.545659035 +0000 UTC m=+0.044605080 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:58:53 np0005591285 systemd[1]: Started libpod-conmon-0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd.scope.
Jan 21 18:58:53 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:58:53 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9254aad97b36a61c7f68f2f7c2c214d2f26ce61789016597d1f62dd3ae4ae9b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:58:53 np0005591285 podman[221996]: 2026-01-21 23:58:53.704115303 +0000 UTC m=+0.203061418 container init 0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 18:58:53 np0005591285 podman[221996]: 2026-01-21 23:58:53.718154734 +0000 UTC m=+0.217100799 container start 0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:58:53 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [NOTICE]   (222016) : New worker (222018) forked
Jan 21 18:58:53 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [NOTICE]   (222016) : Loading success.
Jan 21 18:58:54 np0005591285 nova_compute[182755]: 2026-01-21 23:58:54.857 182759 DEBUG nova.network.neutron [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:58:54 np0005591285 nova_compute[182755]: 2026-01-21 23:58:54.873 182759 DEBUG oslo_concurrency.lockutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:58:54 np0005591285 nova_compute[182755]: 2026-01-21 23:58:54.889 182759 DEBUG nova.compute.manager [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 kernel: tapd96fb6bb-97 (unregistering): left promiscuous mode
Jan 21 18:58:55 np0005591285 NetworkManager[55017]: <info>  [1769039935.0816] device (tapd96fb6bb-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:58:55 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:55Z|00255|binding|INFO|Releasing lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 from this chassis (sb_readonly=0)
Jan 21 18:58:55 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:55Z|00256|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 down in Southbound
Jan 21 18:58:55 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:55Z|00257|binding|INFO|Removing iface tapd96fb6bb-97 ovn-installed in OVS
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.098 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.107 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.110 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.112 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.113 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.116 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c69106-448f-41eb-bd40-ab08b627c257]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.117 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore#033[00m
Jan 21 18:58:55 np0005591285 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 21 18:58:55 np0005591285 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000046.scope: Consumed 14.679s CPU time.
Jan 21 18:58:55 np0005591285 systemd-machined[154022]: Machine qemu-32-instance-00000046 terminated.
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.302 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [NOTICE]   (221421) : haproxy version is 2.8.14-c23fe91
Jan 21 18:58:55 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [NOTICE]   (221421) : path to executable is /usr/sbin/haproxy
Jan 21 18:58:55 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [WARNING]  (221421) : Exiting Master process...
Jan 21 18:58:55 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [WARNING]  (221421) : Exiting Master process...
Jan 21 18:58:55 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [ALERT]    (221421) : Current worker (221423) exited with code 143 (Terminated)
Jan 21 18:58:55 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[221417]: [WARNING]  (221421) : All workers exited. Exiting... (0)
Jan 21 18:58:55 np0005591285 systemd[1]: libpod-ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549.scope: Deactivated successfully.
Jan 21 18:58:55 np0005591285 podman[222048]: 2026-01-21 23:58:55.324621722 +0000 UTC m=+0.072457376 container died ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.351 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance destroyed successfully.#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.354 182759 DEBUG nova.objects.instance [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549-userdata-shm.mount: Deactivated successfully.
Jan 21 18:58:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay-c43af1648b638f28e991b5a990512950a94631aa9495cd065478fa8cadbf332e-merged.mount: Deactivated successfully.
Jan 21 18:58:55 np0005591285 podman[222048]: 2026-01-21 23:58:55.383258873 +0000 UTC m=+0.131094487 container cleanup ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 18:58:55 np0005591285 systemd[1]: libpod-conmon-ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549.scope: Deactivated successfully.
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.407 182759 DEBUG nova.virt.libvirt.vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.408 182759 DEBUG nova.network.os_vif_util [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.409 182759 DEBUG nova.network.os_vif_util [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.409 182759 DEBUG os_vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.411 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.412 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd96fb6bb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.419 182759 INFO os_vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.427 182759 DEBUG nova.virt.libvirt.driver [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Start _get_guest_xml network_info=[{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.430 182759 WARNING nova.virt.libvirt.driver [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.438 182759 DEBUG nova.virt.libvirt.host [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.438 182759 DEBUG nova.virt.libvirt.host [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.442 182759 DEBUG nova.virt.libvirt.host [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.443 182759 DEBUG nova.virt.libvirt.host [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.444 182759 DEBUG nova.virt.libvirt.driver [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.446 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.447 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.447 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.447 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.447 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.448 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.448 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.448 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.449 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.449 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.449 182759 DEBUG nova.virt.hardware [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.450 182759 DEBUG nova.objects.instance [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.453 182759 DEBUG nova.compute.manager [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.454 182759 DEBUG oslo_concurrency.lockutils [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.454 182759 DEBUG oslo_concurrency.lockutils [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.454 182759 DEBUG oslo_concurrency.lockutils [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.454 182759 DEBUG nova.compute.manager [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.455 182759 WARNING nova.compute.manager [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.455 182759 DEBUG nova.compute.manager [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.455 182759 DEBUG oslo_concurrency.lockutils [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.456 182759 DEBUG oslo_concurrency.lockutils [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.456 182759 DEBUG oslo_concurrency.lockutils [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.456 182759 DEBUG nova.compute.manager [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.457 182759 WARNING nova.compute.manager [req-aeeab664-6f47-44df-8bac-532ede751d8d req-5d337f6e-bd1a-4f88-b1f0-ace62bc3c144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:55 np0005591285 podman[222095]: 2026-01-21 23:58:55.473796719 +0000 UTC m=+0.064573634 container remove ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.479 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3bcbecf-0bd4-4854-931a-a38ac9bf7d59]: (4, ('Wed Jan 21 11:58:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549)\nce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549\nWed Jan 21 11:58:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (ce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549)\nce5f4a8ea03ba1e6a22fa04817afcff08c2929abd20a134387ab35b782d86549\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.481 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9d247e3e-6fb4-4879-b6ed-19124e912f2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.482 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:55 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.490 182759 DEBUG nova.virt.libvirt.vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.491 182759 DEBUG nova.network.os_vif_util [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.492 182759 DEBUG nova.network.os_vif_util [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.493 182759 DEBUG nova.objects.instance [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.494 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.499 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.502 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[582ac298-c976-474b-ad67-9eb19347028c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.510 182759 DEBUG nova.virt.libvirt.driver [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <uuid>9308be91-9a92-4389-939a-8b03d37474cf</uuid>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <name>instance-00000046</name>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestJSON-server-396111842</nova:name>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:58:55</nova:creationTime>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        <nova:port uuid="d96fb6bb-9793-4373-8f62-3aa3f32af6a5">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <entry name="serial">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <entry name="uuid">9308be91-9a92-4389-939a-8b03d37474cf</entry>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:c3:44:d7"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <target dev="tapd96fb6bb-97"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/console.log" append="off"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:58:55 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:58:55 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:58:55 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:58:55 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.516 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.521 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c47c5845-dbf6-4920-bae0-2c9f9ea3c038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.523 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb4085e-6054-4752-a496-e3a8f8811335]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.547 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4251dad0-6ce4-4ebe-a031-200858abeb73]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446700, 'reachable_time': 16940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222110, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.549 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:58:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:55.549 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[9809a2fb-3c98-401b-b163-0458956d3ae3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:55 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.619 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.622 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.711 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.715 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.717 182759 DEBUG nova.objects.instance [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.737 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.822 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.823 182759 DEBUG nova.virt.disk.api [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.824 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.918 182759 DEBUG oslo_concurrency.processutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.920 182759 DEBUG nova.virt.disk.api [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.921 182759 DEBUG nova.objects.instance [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.936 182759 DEBUG nova.virt.libvirt.vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.937 182759 DEBUG nova.network.os_vif_util [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.938 182759 DEBUG nova.network.os_vif_util [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.938 182759 DEBUG os_vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.939 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.940 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.940 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.945 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.945 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd96fb6bb-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.945 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd96fb6bb-97, col_values=(('external_ids', {'iface-id': 'd96fb6bb-9793-4373-8f62-3aa3f32af6a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:44:d7', 'vm-uuid': '9308be91-9a92-4389-939a-8b03d37474cf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.948 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:58:55 np0005591285 NetworkManager[55017]: <info>  [1769039935.9501] manager: (tapd96fb6bb-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.955 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:55 np0005591285 nova_compute[182755]: 2026-01-21 23:58:55.957 182759 INFO os_vif [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')#033[00m
Jan 21 18:58:56 np0005591285 kernel: tapd96fb6bb-97: entered promiscuous mode
Jan 21 18:58:56 np0005591285 systemd-udevd[222093]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:56 np0005591285 NetworkManager[55017]: <info>  [1769039936.0643] manager: (tapd96fb6bb-97): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Jan 21 18:58:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:56Z|00258|binding|INFO|Claiming lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for this chassis.
Jan 21 18:58:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:56Z|00259|binding|INFO|d96fb6bb-9793-4373-8f62-3aa3f32af6a5: Claiming fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.072 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '9', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.073 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.076 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:56Z|00260|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 ovn-installed in OVS
Jan 21 18:58:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:56Z|00261|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 up in Southbound
Jan 21 18:58:56 np0005591285 NetworkManager[55017]: <info>  [1769039936.0781] device (tapd96fb6bb-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.076 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7#033[00m
Jan 21 18:58:56 np0005591285 NetworkManager[55017]: <info>  [1769039936.0790] device (tapd96fb6bb-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.085 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.090 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b06d0c-0c50-4298-a439-276bb7263a20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.091 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.094 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.094 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c84886f6-09f8-433f-9ec4-5230bc0ff885]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.095 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[49cad0ce-9efc-4973-9aa4-f51637c63a9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.107 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[90b4d71e-3f77-4ce4-b384-fc8226eddc66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 systemd-machined[154022]: New machine qemu-35-instance-00000046.
Jan 21 18:58:56 np0005591285 systemd[1]: Started Virtual Machine qemu-35-instance-00000046.
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.142 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb7a8c8-3914-46ff-b878-78c486914eaf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.193 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6d50a8-be77-466e-9152-a4d8bb6ef7b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.202 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[af61dbf6-7b0e-45e9-b0d6-4edd899eda18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 NetworkManager[55017]: <info>  [1769039936.2086] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.255 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a288f265-da7c-4d87-813a-ea459d4b45e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.260 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[817f57c7-e4eb-4716-b177-a60cd11bda06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 NetworkManager[55017]: <info>  [1769039936.3008] device (tap19c3e0c8-50): carrier: link connected
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.313 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[96f5a8ca-cb4e-48ef-9017-774dc0735235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.350 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc7443-e08e-467e-a893-f83ef1454f26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449495, 'reachable_time': 30837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222172, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.389 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d09f8a-ad47-45b2-9cdd-429bffe46a91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449495, 'tstamp': 449495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222177, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.419 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e97650-ce62-46f0-97ec-b8baae236019]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449495, 'reachable_time': 30837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222179, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.423 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 9308be91-9a92-4389-939a-8b03d37474cf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.424 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039936.4230437, 9308be91-9a92-4389-939a-8b03d37474cf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.425 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.430 182759 DEBUG nova.compute.manager [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.436 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance rebooted successfully.#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.437 182759 DEBUG nova.compute.manager [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.473 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.478 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.499 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[deb0c541-8f7e-45f0-b120-b8713416edf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.514 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039936.4293609, 9308be91-9a92-4389-939a-8b03d37474cf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.514 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Started (Lifecycle Event)#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.538 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.548 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.551 182759 DEBUG oslo_concurrency.lockutils [None req-1cfd7b6b-7a83-4952-9ee4-6761383e3b7d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.603 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f71f034e-8d00-4131-91af-3eabd9a57045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.606 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.607 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.608 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:56 np0005591285 NetworkManager[55017]: <info>  [1769039936.6121] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Jan 21 18:58:56 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.616 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:56 np0005591285 ovn_controller[94908]: 2026-01-21T23:58:56Z|00262|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.621 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.623 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a70cd7-9b75-487b-85f5-7a1b9b932aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.624 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:58:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:58:56.628 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:58:56 np0005591285 nova_compute[182755]: 2026-01-21 23:58:56.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:58:57 np0005591285 podman[222213]: 2026-01-21 23:58:57.144807217 +0000 UTC m=+0.087846363 container create 98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:58:57 np0005591285 systemd[1]: Started libpod-conmon-98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65.scope.
Jan 21 18:58:57 np0005591285 podman[222213]: 2026-01-21 23:58:57.110703343 +0000 UTC m=+0.053742579 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:58:57 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:58:57 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/863ef7382bfc8a761ac382d82512363832aa0939f83485f0509c621f675bb362/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:58:57 np0005591285 podman[222213]: 2026-01-21 23:58:57.266393134 +0000 UTC m=+0.209432280 container init 98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:58:57 np0005591285 podman[222213]: 2026-01-21 23:58:57.276066087 +0000 UTC m=+0.219105253 container start 98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 18:58:57 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [NOTICE]   (222232) : New worker (222234) forked
Jan 21 18:58:57 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [NOTICE]   (222232) : Loading success.
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.589 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.590 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.591 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.591 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.592 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.592 182759 WARNING nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.592 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.593 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.608 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.608 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.608 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.609 182759 WARNING nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.609 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.609 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.609 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.610 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.610 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.610 182759 WARNING nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.610 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.611 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.611 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.611 182759 DEBUG oslo_concurrency.lockutils [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.611 182759 DEBUG nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:58:57 np0005591285 nova_compute[182755]: 2026-01-21 23:58:57.611 182759 WARNING nova.compute.manager [req-983a1eb8-4d12-4eb6-8a2b-76bdaeefd48b req-b83c506b-62a1-4811-ae26-2c56b2aec0d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:58:59 np0005591285 podman[222243]: 2026-01-21 23:58:59.262772668 +0000 UTC m=+0.120529719 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Jan 21 18:58:59 np0005591285 podman[222244]: 2026-01-21 23:58:59.278971918 +0000 UTC m=+0.129150424 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 18:59:00 np0005591285 nova_compute[182755]: 2026-01-21 23:59:00.716 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:00 np0005591285 nova_compute[182755]: 2026-01-21 23:59:00.948 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.324 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.325 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.349 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.474 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.476 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.485 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.486 182759 INFO nova.compute.claims [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.675 182759 DEBUG nova.compute.provider_tree [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.711 182759 DEBUG nova.scheduler.client.report [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.741 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.743 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.808 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.809 182759 DEBUG nova.network.neutron [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.834 182759 INFO nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.854 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.983 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.986 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.987 182759 INFO nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Creating image(s)#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.988 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.989 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:01 np0005591285 nova_compute[182755]: 2026-01-21 23:59:01.990 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.021 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.109 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.111 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.113 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.129 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.158 182759 DEBUG nova.policy [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.203 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.205 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.251 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.253 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.254 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.320 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.322 182759 DEBUG nova.virt.disk.api [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Checking if we can resize image /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.323 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.388 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.389 182759 DEBUG nova.virt.disk.api [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Cannot resize image /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.390 182759 DEBUG nova.objects.instance [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'migration_context' on Instance uuid 30dd1355-3b44-4697-89e2-e5c929a535ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.415 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.417 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Ensure instance console log exists: /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.418 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.419 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:02 np0005591285 nova_compute[182755]: 2026-01-21 23:59:02.419 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:02.964 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:02.966 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:02.967 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:03 np0005591285 nova_compute[182755]: 2026-01-21 23:59:03.780 182759 DEBUG nova.network.neutron [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Successfully created port: e685e997-ac00-415e-9109-6d37bbb2f577 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 18:59:05 np0005591285 nova_compute[182755]: 2026-01-21 23:59:05.718 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:05 np0005591285 nova_compute[182755]: 2026-01-21 23:59:05.951 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:06 np0005591285 podman[222303]: 2026-01-21 23:59:06.233857401 +0000 UTC m=+0.098426461 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 18:59:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:06Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:59:06 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:06Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:84:20 10.100.0.13
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.418 182759 DEBUG nova.network.neutron [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Successfully updated port: e685e997-ac00-415e-9109-6d37bbb2f577 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.434 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.435 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.435 182759 DEBUG nova.network.neutron [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.547 182759 DEBUG nova.compute.manager [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.548 182759 DEBUG nova.compute.manager [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing instance network info cache due to event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:59:07 np0005591285 nova_compute[182755]: 2026-01-21 23:59:07.549 182759 DEBUG oslo_concurrency.lockutils [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:08 np0005591285 nova_compute[182755]: 2026-01-21 23:59:08.496 182759 DEBUG nova.network.neutron [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 18:59:10 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:10Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:44:d7 10.100.0.7
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.477 182759 DEBUG nova.network.neutron [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.515 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.516 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Instance network_info: |[{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.517 182759 DEBUG oslo_concurrency.lockutils [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.517 182759 DEBUG nova.network.neutron [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.523 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Start _get_guest_xml network_info=[{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.531 182759 WARNING nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.536 182759 DEBUG nova.virt.libvirt.host [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.537 182759 DEBUG nova.virt.libvirt.host [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.543 182759 DEBUG nova.virt.libvirt.host [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.544 182759 DEBUG nova.virt.libvirt.host [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.546 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.547 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.548 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.548 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.549 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.549 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.549 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.550 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.550 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.551 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.551 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.552 182759 DEBUG nova.virt.hardware [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.558 182759 DEBUG nova.virt.libvirt.vif [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.559 182759 DEBUG nova.network.os_vif_util [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.561 182759 DEBUG nova.network.os_vif_util [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.563 182759 DEBUG nova.objects.instance [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30dd1355-3b44-4697-89e2-e5c929a535ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.586 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] End _get_guest_xml xml=<domain type="kvm">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <uuid>30dd1355-3b44-4697-89e2-e5c929a535ac</uuid>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <name>instance-0000004d</name>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:name>tempest-tempest.common.compute-instance-1677916293</nova:name>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-21 23:59:10</nova:creationTime>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        <nova:port uuid="e685e997-ac00-415e-9109-6d37bbb2f577">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <entry name="serial">30dd1355-3b44-4697-89e2-e5c929a535ac</entry>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <entry name="uuid">30dd1355-3b44-4697-89e2-e5c929a535ac</entry>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.config"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:a5:00:2a"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <target dev="tape685e997-ac"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/console.log" append="off"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 18:59:10 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:59:10 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:59:10 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:59:10 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.588 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Preparing to wait for external event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.589 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.590 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.591 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.592 182759 DEBUG nova.virt.libvirt.vif [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.593 182759 DEBUG nova.network.os_vif_util [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.594 182759 DEBUG nova.network.os_vif_util [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.595 182759 DEBUG os_vif [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.597 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.598 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.598 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.604 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.604 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape685e997-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.605 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape685e997-ac, col_values=(('external_ids', {'iface-id': 'e685e997-ac00-415e-9109-6d37bbb2f577', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:00:2a', 'vm-uuid': '30dd1355-3b44-4697-89e2-e5c929a535ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.609 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:10 np0005591285 NetworkManager[55017]: <info>  [1769039950.6108] manager: (tape685e997-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.623 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.624 182759 INFO os_vif [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac')#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.723 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.779 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.780 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.780 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:a5:00:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:59:10 np0005591285 nova_compute[182755]: 2026-01-21 23:59:10.782 182759 INFO nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Using config drive#033[00m
Jan 21 18:59:10 np0005591285 podman[222339]: 2026-01-21 23:59:10.787344775 +0000 UTC m=+0.099834609 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:59:10 np0005591285 podman[222338]: 2026-01-21 23:59:10.793844991 +0000 UTC m=+0.110099147 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 18:59:10 np0005591285 podman[222340]: 2026-01-21 23:59:10.844062473 +0000 UTC m=+0.154958373 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 18:59:11 np0005591285 nova_compute[182755]: 2026-01-21 23:59:11.641 182759 INFO nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Creating config drive at /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.config#033[00m
Jan 21 18:59:11 np0005591285 nova_compute[182755]: 2026-01-21 23:59:11.648 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb1oyzche execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:11 np0005591285 nova_compute[182755]: 2026-01-21 23:59:11.795 182759 DEBUG oslo_concurrency.processutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb1oyzche" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:11 np0005591285 kernel: tape685e997-ac: entered promiscuous mode
Jan 21 18:59:11 np0005591285 NetworkManager[55017]: <info>  [1769039951.8754] manager: (tape685e997-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Jan 21 18:59:11 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:11Z|00263|binding|INFO|Claiming lport e685e997-ac00-415e-9109-6d37bbb2f577 for this chassis.
Jan 21 18:59:11 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:11Z|00264|binding|INFO|e685e997-ac00-415e-9109-6d37bbb2f577: Claiming fa:16:3e:a5:00:2a 10.100.0.8
Jan 21 18:59:11 np0005591285 nova_compute[182755]: 2026-01-21 23:59:11.876 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:11 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:11Z|00265|binding|INFO|Setting lport e685e997-ac00-415e-9109-6d37bbb2f577 ovn-installed in OVS
Jan 21 18:59:11 np0005591285 nova_compute[182755]: 2026-01-21 23:59:11.890 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:11 np0005591285 nova_compute[182755]: 2026-01-21 23:59:11.894 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:11 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:11Z|00266|binding|INFO|Setting lport e685e997-ac00-415e-9109-6d37bbb2f577 up in Southbound
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.903 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:00:2a 10.100.0.8'], port_security=['fa:16:3e:a5:00:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '30dd1355-3b44-4697-89e2-e5c929a535ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e3b7d6e-99c3-4bed-a6db-24cc4d63ab1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=e685e997-ac00-415e-9109-6d37bbb2f577) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.904 104259 INFO neutron.agent.ovn.metadata.agent [-] Port e685e997-ac00-415e-9109-6d37bbb2f577 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis#033[00m
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.907 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592#033[00m
Jan 21 18:59:11 np0005591285 systemd-udevd[222423]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.924 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a77b0258-06cd-47db-b141-5b81040b821e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.926 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1995baab-01 in ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 18:59:11 np0005591285 NetworkManager[55017]: <info>  [1769039951.9293] device (tape685e997-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:59:11 np0005591285 NetworkManager[55017]: <info>  [1769039951.9297] device (tape685e997-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:59:11 np0005591285 systemd-machined[154022]: New machine qemu-36-instance-0000004d.
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.929 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1995baab-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.929 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cb44cda4-4dda-4741-82de-67b4312a4199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.932 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9f74fd90-5cdd-4059-a47f-f94715d9f36a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:11 np0005591285 systemd[1]: Started Virtual Machine qemu-36-instance-0000004d.
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.949 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[72335c4c-f45e-4d82-b349-e9dbd3fab5df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:11.973 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[654d0996-4007-4ccb-9e8f-4326a2821b09]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.009 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b3cf7e-a6bf-4501-aac5-1dc5cdc01a13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 NetworkManager[55017]: <info>  [1769039952.0166] manager: (tap1995baab-00): new Veth device (/org/freedesktop/NetworkManager/Devices/135)
Jan 21 18:59:12 np0005591285 systemd-udevd[222427]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.017 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4c2c3e14-ebbd-41c5-84e6-4667b491adea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.057 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[62afde78-d814-4678-a800-fd646b5c4360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.062 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2a978733-1d05-4767-b23d-b355b4cc81d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 NetworkManager[55017]: <info>  [1769039952.0926] device (tap1995baab-00): carrier: link connected
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.098 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb6b556-27ee-4ade-8153-ffa7f17da91c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.121 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b6296d57-0fff-4fcb-b688-c9de58946b2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451075, 'reachable_time': 32815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222459, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.143 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[df7b751a-3bd8-48e4-a3a8-37ccec5fa7f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:ff2f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451075, 'tstamp': 451075}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222464, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.165 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0a1902-6558-4c84-be7e-db5e724bc8c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451075, 'reachable_time': 32815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222465, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.206 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039952.2052703, 30dd1355-3b44-4697-89e2-e5c929a535ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.207 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] VM Started (Lifecycle Event)#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.210 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ff689f5a-ddc5-4f55-adf7-5515143d8116]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.231 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.237 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039952.205681, 30dd1355-3b44-4697-89e2-e5c929a535ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.237 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] VM Paused (Lifecycle Event)#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.260 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.265 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.289 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.296 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[213499d3-2652-4fc6-8190-e300c29dbee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.299 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.299 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.300 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.302 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:12 np0005591285 NetworkManager[55017]: <info>  [1769039952.3036] manager: (tap1995baab-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 21 18:59:12 np0005591285 kernel: tap1995baab-00: entered promiscuous mode
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.307 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.309 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:12 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:12Z|00267|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.312 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.313 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[828f0af8-678f-4e18-9ab0-46a485247880]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.314 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/1995baab-0f8d-4658-a4fc-2d21868dc592.pid.haproxy
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 1995baab-0f8d-4658-a4fc-2d21868dc592
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 18:59:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:12.315 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'env', 'PROCESS_TAG=haproxy-1995baab-0f8d-4658-a4fc-2d21868dc592', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1995baab-0f8d-4658-a4fc-2d21868dc592.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.739 182759 DEBUG nova.compute.manager [req-48041955-66a7-41c6-a852-09c46eaa53d2 req-2cebd04d-6415-48a0-85c2-0a3713264de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.741 182759 DEBUG oslo_concurrency.lockutils [req-48041955-66a7-41c6-a852-09c46eaa53d2 req-2cebd04d-6415-48a0-85c2-0a3713264de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.741 182759 DEBUG oslo_concurrency.lockutils [req-48041955-66a7-41c6-a852-09c46eaa53d2 req-2cebd04d-6415-48a0-85c2-0a3713264de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.742 182759 DEBUG oslo_concurrency.lockutils [req-48041955-66a7-41c6-a852-09c46eaa53d2 req-2cebd04d-6415-48a0-85c2-0a3713264de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.742 182759 DEBUG nova.compute.manager [req-48041955-66a7-41c6-a852-09c46eaa53d2 req-2cebd04d-6415-48a0-85c2-0a3713264de2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Processing event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.743 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.749 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.749 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769039952.7488062, 30dd1355-3b44-4697-89e2-e5c929a535ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.749 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] VM Resumed (Lifecycle Event)#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.756 182759 INFO nova.virt.libvirt.driver [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Instance spawned successfully.#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.756 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 18:59:12 np0005591285 podman[222498]: 2026-01-21 23:59:12.790820551 +0000 UTC m=+0.069261320 container create ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.790 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.797 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.797 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.798 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.798 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.799 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.799 182759 DEBUG nova.virt.libvirt.driver [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.803 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 18:59:12 np0005591285 systemd[1]: Started libpod-conmon-ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e.scope.
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.839 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 18:59:12 np0005591285 podman[222498]: 2026-01-21 23:59:12.753808367 +0000 UTC m=+0.032249156 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 18:59:12 np0005591285 systemd[1]: Started libcrun container.
Jan 21 18:59:12 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/042f0bda2a0faad399b1887b473b147061d18f9d27a76793a3947ff145b3c7b4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.885 182759 INFO nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Took 10.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.885 182759 DEBUG nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:59:12 np0005591285 podman[222498]: 2026-01-21 23:59:12.890793613 +0000 UTC m=+0.169234462 container init ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:59:12 np0005591285 podman[222498]: 2026-01-21 23:59:12.902815148 +0000 UTC m=+0.181255957 container start ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 18:59:12 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [NOTICE]   (222518) : New worker (222520) forked
Jan 21 18:59:12 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [NOTICE]   (222518) : Loading success.
Jan 21 18:59:12 np0005591285 nova_compute[182755]: 2026-01-21 23:59:12.985 182759 INFO nova.compute.manager [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Took 11.57 seconds to build instance.#033[00m
Jan 21 18:59:13 np0005591285 nova_compute[182755]: 2026-01-21 23:59:13.019 182759 DEBUG oslo_concurrency.lockutils [None req-0445d58e-0401-4878-a15f-95ff6c7ea0b2 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:13 np0005591285 nova_compute[182755]: 2026-01-21 23:59:13.105 182759 DEBUG nova.network.neutron [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updated VIF entry in instance network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:59:13 np0005591285 nova_compute[182755]: 2026-01-21 23:59:13.106 182759 DEBUG nova.network.neutron [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:13 np0005591285 nova_compute[182755]: 2026-01-21 23:59:13.125 182759 DEBUG oslo_concurrency.lockutils [req-167e8d9a-2f48-4a9f-92b6-23ed97cae0d3 req-b8e552f6-e45e-469b-ba96-002486f82106 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:14 np0005591285 nova_compute[182755]: 2026-01-21 23:59:14.841 182759 DEBUG nova.compute.manager [req-5931cbbf-c806-4d9a-8ba2-bbc67a2122e8 req-0e8f1a9f-ebdf-4e85-91ef-fbcaec01827b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:14 np0005591285 nova_compute[182755]: 2026-01-21 23:59:14.842 182759 DEBUG oslo_concurrency.lockutils [req-5931cbbf-c806-4d9a-8ba2-bbc67a2122e8 req-0e8f1a9f-ebdf-4e85-91ef-fbcaec01827b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:14 np0005591285 nova_compute[182755]: 2026-01-21 23:59:14.842 182759 DEBUG oslo_concurrency.lockutils [req-5931cbbf-c806-4d9a-8ba2-bbc67a2122e8 req-0e8f1a9f-ebdf-4e85-91ef-fbcaec01827b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:14 np0005591285 nova_compute[182755]: 2026-01-21 23:59:14.842 182759 DEBUG oslo_concurrency.lockutils [req-5931cbbf-c806-4d9a-8ba2-bbc67a2122e8 req-0e8f1a9f-ebdf-4e85-91ef-fbcaec01827b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:14 np0005591285 nova_compute[182755]: 2026-01-21 23:59:14.842 182759 DEBUG nova.compute.manager [req-5931cbbf-c806-4d9a-8ba2-bbc67a2122e8 req-0e8f1a9f-ebdf-4e85-91ef-fbcaec01827b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:14 np0005591285 nova_compute[182755]: 2026-01-21 23:59:14.843 182759 WARNING nova.compute.manager [req-5931cbbf-c806-4d9a-8ba2-bbc67a2122e8 req-0e8f1a9f-ebdf-4e85-91ef-fbcaec01827b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received unexpected event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:59:15 np0005591285 nova_compute[182755]: 2026-01-21 23:59:15.652 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:15 np0005591285 nova_compute[182755]: 2026-01-21 23:59:15.725 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:17.768 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:59:17 np0005591285 nova_compute[182755]: 2026-01-21 23:59:17.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:17.770 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 18:59:19 np0005591285 nova_compute[182755]: 2026-01-21 23:59:19.234 182759 DEBUG nova.compute.manager [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:19 np0005591285 nova_compute[182755]: 2026-01-21 23:59:19.235 182759 DEBUG nova.compute.manager [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing instance network info cache due to event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:59:19 np0005591285 nova_compute[182755]: 2026-01-21 23:59:19.236 182759 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:19 np0005591285 nova_compute[182755]: 2026-01-21 23:59:19.237 182759 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:19 np0005591285 nova_compute[182755]: 2026-01-21 23:59:19.237 182759 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:59:20 np0005591285 nova_compute[182755]: 2026-01-21 23:59:20.659 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:20 np0005591285 nova_compute[182755]: 2026-01-21 23:59:20.731 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:20 np0005591285 nova_compute[182755]: 2026-01-21 23:59:20.896 182759 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updated VIF entry in instance network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:59:20 np0005591285 nova_compute[182755]: 2026-01-21 23:59:20.898 182759 DEBUG nova.network.neutron [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:20 np0005591285 nova_compute[182755]: 2026-01-21 23:59:20.926 182759 DEBUG oslo_concurrency.lockutils [req-2f8025c7-bcb1-429f-ad02-9855e3dbed41 req-7e55306a-315a-4334-a943-d487cfd20b6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:21 np0005591285 nova_compute[182755]: 2026-01-21 23:59:21.328 182759 DEBUG nova.compute.manager [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:21 np0005591285 nova_compute[182755]: 2026-01-21 23:59:21.328 182759 DEBUG nova.compute.manager [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing instance network info cache due to event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:59:21 np0005591285 nova_compute[182755]: 2026-01-21 23:59:21.329 182759 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:21 np0005591285 nova_compute[182755]: 2026-01-21 23:59:21.329 182759 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:21 np0005591285 nova_compute[182755]: 2026-01-21 23:59:21.330 182759 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:59:22 np0005591285 nova_compute[182755]: 2026-01-21 23:59:22.718 182759 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updated VIF entry in instance network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:59:22 np0005591285 nova_compute[182755]: 2026-01-21 23:59:22.719 182759 DEBUG nova.network.neutron [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:22 np0005591285 nova_compute[182755]: 2026-01-21 23:59:22.756 182759 DEBUG oslo_concurrency.lockutils [req-30fb0ffb-fabe-4200-a328-63b1afb53689 req-732b0965-3b2b-41a4-a811-29c7fa2adda7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:25 np0005591285 nova_compute[182755]: 2026-01-21 23:59:25.665 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:25 np0005591285 nova_compute[182755]: 2026-01-21 23:59:25.736 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:26.774 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:26 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:26Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:00:2a 10.100.0.8
Jan 21 18:59:26 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:26Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:00:2a 10.100.0.8
Jan 21 18:59:27 np0005591285 nova_compute[182755]: 2026-01-21 23:59:27.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:29 np0005591285 nova_compute[182755]: 2026-01-21 23:59:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:29 np0005591285 nova_compute[182755]: 2026-01-21 23:59:29.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 18:59:30 np0005591285 nova_compute[182755]: 2026-01-21 23:59:30.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:30 np0005591285 nova_compute[182755]: 2026-01-21 23:59:30.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:30 np0005591285 podman[222546]: 2026-01-21 23:59:30.250541384 +0000 UTC m=+0.103834838 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Jan 21 18:59:30 np0005591285 podman[222547]: 2026-01-21 23:59:30.250895383 +0000 UTC m=+0.097479265 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 21 18:59:30 np0005591285 nova_compute[182755]: 2026-01-21 23:59:30.670 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:30 np0005591285 nova_compute[182755]: 2026-01-21 23:59:30.739 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.250 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.250 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.335 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.435 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.437 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.499 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.508 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.604 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.605 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.672 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.678 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.739 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.741 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 18:59:32 np0005591285 nova_compute[182755]: 2026-01-21 23:59:32.802 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.049 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.051 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5151MB free_disk=73.18172454833984GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.051 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.052 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.141 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 9308be91-9a92-4389-939a-8b03d37474cf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.142 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 83fe04ea-7d77-4003-9276-6a7d268e942a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.142 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 30dd1355-3b44-4697-89e2-e5c929a535ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.142 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.142 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.159 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.177 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.177 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.202 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.220 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.340 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.359 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.398 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 18:59:33 np0005591285 nova_compute[182755]: 2026-01-21 23:59:33.398 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:35 np0005591285 nova_compute[182755]: 2026-01-21 23:59:35.675 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:35 np0005591285 nova_compute[182755]: 2026-01-21 23:59:35.741 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:36 np0005591285 nova_compute[182755]: 2026-01-21 23:59:36.394 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:36 np0005591285 nova_compute[182755]: 2026-01-21 23:59:36.429 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:36 np0005591285 nova_compute[182755]: 2026-01-21 23:59:36.429 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 18:59:36 np0005591285 nova_compute[182755]: 2026-01-21 23:59:36.707 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:36 np0005591285 nova_compute[182755]: 2026-01-21 23:59:36.708 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:36 np0005591285 nova_compute[182755]: 2026-01-21 23:59:36.708 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 18:59:37 np0005591285 podman[222607]: 2026-01-21 23:59:37.234261798 +0000 UTC m=+0.085695316 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 18:59:38 np0005591285 nova_compute[182755]: 2026-01-21 23:59:38.241 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updating instance_info_cache with network_info: [{"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:38 np0005591285 nova_compute[182755]: 2026-01-21 23:59:38.272 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-83fe04ea-7d77-4003-9276-6a7d268e942a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:38 np0005591285 nova_compute[182755]: 2026-01-21 23:59:38.273 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 18:59:38 np0005591285 nova_compute[182755]: 2026-01-21 23:59:38.274 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:38 np0005591285 nova_compute[182755]: 2026-01-21 23:59:38.275 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:39 np0005591285 nova_compute[182755]: 2026-01-21 23:59:39.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.709 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.713 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.714 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.714 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.715 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.715 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.733 182759 INFO nova.compute.manager [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Terminating instance#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.745 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.749 182759 DEBUG nova.compute.manager [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:59:40 np0005591285 kernel: tap8e162717-2b (unregistering): left promiscuous mode
Jan 21 18:59:40 np0005591285 NetworkManager[55017]: <info>  [1769039980.7890] device (tap8e162717-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:59:40 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:40Z|00268|binding|INFO|Releasing lport 8e162717-2b5c-4731-8484-d2c68330bdaa from this chassis (sb_readonly=0)
Jan 21 18:59:40 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:40Z|00269|binding|INFO|Setting lport 8e162717-2b5c-4731-8484-d2c68330bdaa down in Southbound
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.802 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:40 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:40Z|00270|binding|INFO|Removing iface tap8e162717-2b ovn-installed in OVS
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.804 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:40.815 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:84:20 10.100.0.13'], port_security=['fa:16:3e:2d:84:20 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83fe04ea-7d77-4003-9276-6a7d268e942a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=8e162717-2b5c-4731-8484-d2c68330bdaa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:59:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:40.817 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 8e162717-2b5c-4731-8484-d2c68330bdaa in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis#033[00m
Jan 21 18:59:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:40.820 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:59:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:40.823 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed4abe1-9a86-473b-baf7-9e4759ea75f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:40.824 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace which is not needed anymore#033[00m
Jan 21 18:59:40 np0005591285 nova_compute[182755]: 2026-01-21 23:59:40.844 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:40 np0005591285 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Jan 21 18:59:40 np0005591285 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000004a.scope: Consumed 14.832s CPU time.
Jan 21 18:59:40 np0005591285 systemd-machined[154022]: Machine qemu-34-instance-0000004a terminated.
Jan 21 18:59:40 np0005591285 podman[222636]: 2026-01-21 23:59:40.94699815 +0000 UTC m=+0.097734642 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 18:59:40 np0005591285 podman[222635]: 2026-01-21 23:59:40.973963702 +0000 UTC m=+0.134964962 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 18:59:41 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [NOTICE]   (222016) : haproxy version is 2.8.14-c23fe91
Jan 21 18:59:41 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [NOTICE]   (222016) : path to executable is /usr/sbin/haproxy
Jan 21 18:59:41 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [WARNING]  (222016) : Exiting Master process...
Jan 21 18:59:41 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [WARNING]  (222016) : Exiting Master process...
Jan 21 18:59:41 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [ALERT]    (222016) : Current worker (222018) exited with code 143 (Terminated)
Jan 21 18:59:41 np0005591285 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222012]: [WARNING]  (222016) : All workers exited. Exiting... (0)
Jan 21 18:59:41 np0005591285 systemd[1]: libpod-0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd.scope: Deactivated successfully.
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.041 182759 INFO nova.virt.libvirt.driver [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Instance destroyed successfully.#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.042 182759 DEBUG nova.objects.instance [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'resources' on Instance uuid 83fe04ea-7d77-4003-9276-6a7d268e942a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:41 np0005591285 podman[222714]: 2026-01-21 23:59:41.045786839 +0000 UTC m=+0.065130957 container died 0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 18:59:41 np0005591285 podman[222653]: 2026-01-21 23:59:41.057188888 +0000 UTC m=+0.158559891 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.072 182759 DEBUG nova.virt.libvirt.vif [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1755799853',display_name='tempest-ServerStableDeviceRescueTest-server-1755799853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1755799853',id=74,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-gcfnr2kl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:53Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=83fe04ea-7d77-4003-9276-6a7d268e942a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.074 182759 DEBUG nova.network.os_vif_util [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "8e162717-2b5c-4731-8484-d2c68330bdaa", "address": "fa:16:3e:2d:84:20", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e162717-2b", "ovs_interfaceid": "8e162717-2b5c-4731-8484-d2c68330bdaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.074 182759 DEBUG nova.network.os_vif_util [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.075 182759 DEBUG os_vif [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.077 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.077 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e162717-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.079 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.081 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:41 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd-userdata-shm.mount: Deactivated successfully.
Jan 21 18:59:41 np0005591285 systemd[1]: var-lib-containers-storage-overlay-d9254aad97b36a61c7f68f2f7c2c214d2f26ce61789016597d1f62dd3ae4ae9b-merged.mount: Deactivated successfully.
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.092 182759 INFO os_vif [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:84:20,bridge_name='br-int',has_traffic_filtering=True,id=8e162717-2b5c-4731-8484-d2c68330bdaa,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e162717-2b')#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.093 182759 INFO nova.virt.libvirt.driver [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Deleting instance files /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a_del#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.094 182759 INFO nova.virt.libvirt.driver [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Deletion of /var/lib/nova/instances/83fe04ea-7d77-4003-9276-6a7d268e942a_del complete#033[00m
Jan 21 18:59:41 np0005591285 podman[222714]: 2026-01-21 23:59:41.094799929 +0000 UTC m=+0.114144017 container cleanup 0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 18:59:41 np0005591285 systemd[1]: libpod-conmon-0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd.scope: Deactivated successfully.
Jan 21 18:59:41 np0005591285 podman[222769]: 2026-01-21 23:59:41.169103773 +0000 UTC m=+0.049461331 container remove 0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.176 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b7311329-5e4e-4081-9091-c8f4974cbbc3]: (4, ('Wed Jan 21 11:59:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd)\n0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd\nWed Jan 21 11:59:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd)\n0c19d527c5912897c5ede06cd1b4108b951dd546940f33ed9261e3c33c58a7fd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.179 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[149d6282-cb28-4cbd-bea3-13869a296433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.180 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.182 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:41 np0005591285 kernel: tap58cd83db-d0: left promiscuous mode
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.195 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.199 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59f0c7ee-b91d-4d93-8673-60d94b189d46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.216 182759 INFO nova.compute.manager [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.217 182759 DEBUG oslo.service.loopingcall [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.217 182759 DEBUG nova.compute.manager [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.218 182759 DEBUG nova.network.neutron [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.217 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ed86c880-3667-48ac-8c5c-eac995abbf2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.219 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1aea6920-40e0-4541-a0d6-81cfe2d291e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.231 182759 DEBUG nova.compute.manager [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.231 182759 DEBUG nova.compute.manager [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing instance network info cache due to event network-changed-e685e997-ac00-415e-9109-6d37bbb2f577. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.232 182759 DEBUG oslo_concurrency.lockutils [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.232 182759 DEBUG oslo_concurrency.lockutils [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.232 182759 DEBUG nova.network.neutron [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.243 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0222e6-5344-454b-b05d-b0e7fbe2b6de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449143, 'reachable_time': 25641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222784, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 systemd[1]: run-netns-ovnmeta\x2d58cd83db\x2ddcb3\x2d409c\x2da108\x2d07601ce5f67a.mount: Deactivated successfully.
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.249 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:59:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:41.249 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[277f7742-102e-4807-b875-6a431b6d4a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.839 182759 DEBUG oslo_concurrency.lockutils [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-30dd1355-3b44-4697-89e2-e5c929a535ac-c596dfbe-ce59-4ab9-8cad-bd4144812420" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.839 182759 DEBUG oslo_concurrency.lockutils [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30dd1355-3b44-4697-89e2-e5c929a535ac-c596dfbe-ce59-4ab9-8cad-bd4144812420" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:41 np0005591285 nova_compute[182755]: 2026-01-21 23:59:41.839 182759 DEBUG nova.objects.instance [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid 30dd1355-3b44-4697-89e2-e5c929a535ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.012 182759 DEBUG nova.network.neutron [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.033 182759 INFO nova.compute.manager [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Took 1.82 seconds to deallocate network for instance.#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.065 182759 DEBUG nova.compute.manager [req-3ed26449-cc74-48fa-b70d-a8aa43666ee4 req-7e6e37be-15fa-424d-8cf1-5f5617c2ba77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.066 182759 DEBUG oslo_concurrency.lockutils [req-3ed26449-cc74-48fa-b70d-a8aa43666ee4 req-7e6e37be-15fa-424d-8cf1-5f5617c2ba77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.066 182759 DEBUG oslo_concurrency.lockutils [req-3ed26449-cc74-48fa-b70d-a8aa43666ee4 req-7e6e37be-15fa-424d-8cf1-5f5617c2ba77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.066 182759 DEBUG oslo_concurrency.lockutils [req-3ed26449-cc74-48fa-b70d-a8aa43666ee4 req-7e6e37be-15fa-424d-8cf1-5f5617c2ba77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.067 182759 DEBUG nova.compute.manager [req-3ed26449-cc74-48fa-b70d-a8aa43666ee4 req-7e6e37be-15fa-424d-8cf1-5f5617c2ba77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.067 182759 DEBUG nova.compute.manager [req-3ed26449-cc74-48fa-b70d-a8aa43666ee4 req-7e6e37be-15fa-424d-8cf1-5f5617c2ba77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-unplugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.150 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.151 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.280 182759 DEBUG nova.compute.provider_tree [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.299 182759 DEBUG nova.scheduler.client.report [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.323 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.350 182759 INFO nova.scheduler.client.report [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Deleted allocations for instance 83fe04ea-7d77-4003-9276-6a7d268e942a#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.359 182759 DEBUG nova.compute.manager [req-7f733605-b4cb-48d6-b18f-e5ce67b13b42 req-744aeb39-1016-49ba-9725-03db7ee6bca4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-deleted-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.445 182759 DEBUG oslo_concurrency.lockutils [None req-b6481e2d-ede0-481b-b161-d834a6991ba1 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.471 182759 DEBUG nova.objects.instance [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'pci_requests' on Instance uuid 30dd1355-3b44-4697-89e2-e5c929a535ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:43 np0005591285 nova_compute[182755]: 2026-01-21 23:59:43.486 182759 DEBUG nova.network.neutron [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 18:59:44 np0005591285 nova_compute[182755]: 2026-01-21 23:59:44.538 182759 DEBUG nova.network.neutron [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updated VIF entry in instance network info cache for port e685e997-ac00-415e-9109-6d37bbb2f577. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:59:44 np0005591285 nova_compute[182755]: 2026-01-21 23:59:44.539 182759 DEBUG nova.network.neutron [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:44 np0005591285 nova_compute[182755]: 2026-01-21 23:59:44.565 182759 DEBUG oslo_concurrency.lockutils [req-3dcff90f-86e4-4115-8aaa-8eef101b08a1 req-27520963-02ce-4a7c-a009-6ad5e3ad8849 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:44 np0005591285 nova_compute[182755]: 2026-01-21 23:59:44.769 182759 DEBUG nova.policy [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0f8ef02149394f2dac899fc3395b6bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '717cc581e6a349a98dfd390d05b18624', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.209 182759 DEBUG nova.compute.manager [req-98f0d694-fc14-4015-8ce8-62acf8899665 req-ad7865b8-a834-4a4a-a8eb-0fa41aa9c64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.210 182759 DEBUG oslo_concurrency.lockutils [req-98f0d694-fc14-4015-8ce8-62acf8899665 req-ad7865b8-a834-4a4a-a8eb-0fa41aa9c64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.211 182759 DEBUG oslo_concurrency.lockutils [req-98f0d694-fc14-4015-8ce8-62acf8899665 req-ad7865b8-a834-4a4a-a8eb-0fa41aa9c64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.211 182759 DEBUG oslo_concurrency.lockutils [req-98f0d694-fc14-4015-8ce8-62acf8899665 req-ad7865b8-a834-4a4a-a8eb-0fa41aa9c64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "83fe04ea-7d77-4003-9276-6a7d268e942a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.211 182759 DEBUG nova.compute.manager [req-98f0d694-fc14-4015-8ce8-62acf8899665 req-ad7865b8-a834-4a4a-a8eb-0fa41aa9c64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] No waiting events found dispatching network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.212 182759 WARNING nova.compute.manager [req-98f0d694-fc14-4015-8ce8-62acf8899665 req-ad7865b8-a834-4a4a-a8eb-0fa41aa9c64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Received unexpected event network-vif-plugged-8e162717-2b5c-4731-8484-d2c68330bdaa for instance with vm_state deleted and task_state None.#033[00m
Jan 21 18:59:45 np0005591285 nova_compute[182755]: 2026-01-21 23:59:45.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:46 np0005591285 nova_compute[182755]: 2026-01-21 23:59:46.080 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:46 np0005591285 nova_compute[182755]: 2026-01-21 23:59:46.789 182759 DEBUG nova.network.neutron [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Successfully updated port: c596dfbe-ce59-4ab9-8cad-bd4144812420 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 18:59:46 np0005591285 nova_compute[182755]: 2026-01-21 23:59:46.815 182759 DEBUG oslo_concurrency.lockutils [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:46 np0005591285 nova_compute[182755]: 2026-01-21 23:59:46.816 182759 DEBUG oslo_concurrency.lockutils [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:46 np0005591285 nova_compute[182755]: 2026-01-21 23:59:46.816 182759 DEBUG nova.network.neutron [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:59:47 np0005591285 nova_compute[182755]: 2026-01-21 23:59:47.086 182759 WARNING nova.network.neutron [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] 1995baab-0f8d-4658-a4fc-2d21868dc592 already exists in list: networks containing: ['1995baab-0f8d-4658-a4fc-2d21868dc592']. ignoring it#033[00m
Jan 21 18:59:47 np0005591285 nova_compute[182755]: 2026-01-21 23:59:47.338 182759 DEBUG nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-changed-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:47 np0005591285 nova_compute[182755]: 2026-01-21 23:59:47.339 182759 DEBUG nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing instance network info cache due to event network-changed-c596dfbe-ce59-4ab9-8cad-bd4144812420. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 18:59:47 np0005591285 nova_compute[182755]: 2026-01-21 23:59:47.339 182759 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.762 182759 DEBUG nova.network.neutron [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.793 182759 DEBUG oslo_concurrency.lockutils [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.796 182759 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.796 182759 DEBUG nova.network.neutron [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Refreshing network info cache for port c596dfbe-ce59-4ab9-8cad-bd4144812420 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.802 182759 DEBUG nova.virt.libvirt.vif [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.802 182759 DEBUG nova.network.os_vif_util [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.804 182759 DEBUG nova.network.os_vif_util [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.804 182759 DEBUG os_vif [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.806 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.807 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.807 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.812 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc596dfbe-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.813 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc596dfbe-ce, col_values=(('external_ids', {'iface-id': 'c596dfbe-ce59-4ab9-8cad-bd4144812420', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:23:22', 'vm-uuid': '30dd1355-3b44-4697-89e2-e5c929a535ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.815 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:49 np0005591285 NetworkManager[55017]: <info>  [1769039989.8174] manager: (tapc596dfbe-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.824 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.826 182759 INFO os_vif [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce')#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.827 182759 DEBUG nova.virt.libvirt.vif [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.827 182759 DEBUG nova.network.os_vif_util [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.829 182759 DEBUG nova.network.os_vif_util [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.833 182759 DEBUG nova.virt.libvirt.guest [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] attach device xml: <interface type="ethernet">
Jan 21 18:59:49 np0005591285 nova_compute[182755]:  <mac address="fa:16:3e:24:23:22"/>
Jan 21 18:59:49 np0005591285 nova_compute[182755]:  <model type="virtio"/>
Jan 21 18:59:49 np0005591285 nova_compute[182755]:  <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:59:49 np0005591285 nova_compute[182755]:  <mtu size="1442"/>
Jan 21 18:59:49 np0005591285 nova_compute[182755]:  <target dev="tapc596dfbe-ce"/>
Jan 21 18:59:49 np0005591285 nova_compute[182755]: </interface>
Jan 21 18:59:49 np0005591285 nova_compute[182755]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 21 18:59:49 np0005591285 kernel: tapc596dfbe-ce: entered promiscuous mode
Jan 21 18:59:49 np0005591285 NetworkManager[55017]: <info>  [1769039989.8560] manager: (tapc596dfbe-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.855 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:49 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:49Z|00271|binding|INFO|Claiming lport c596dfbe-ce59-4ab9-8cad-bd4144812420 for this chassis.
Jan 21 18:59:49 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:49Z|00272|binding|INFO|c596dfbe-ce59-4ab9-8cad-bd4144812420: Claiming fa:16:3e:24:23:22 10.100.0.14
Jan 21 18:59:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:49.883 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:23:22 10.100.0.14'], port_security=['fa:16:3e:24:23:22 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '30dd1355-3b44-4697-89e2-e5c929a535ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '7', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c596dfbe-ce59-4ab9-8cad-bd4144812420) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:59:49 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:49Z|00273|binding|INFO|Setting lport c596dfbe-ce59-4ab9-8cad-bd4144812420 ovn-installed in OVS
Jan 21 18:59:49 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:49Z|00274|binding|INFO|Setting lport c596dfbe-ce59-4ab9-8cad-bd4144812420 up in Southbound
Jan 21 18:59:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:49.887 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c596dfbe-ce59-4ab9-8cad-bd4144812420 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 bound to our chassis#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.890 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:49.893 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592#033[00m
Jan 21 18:59:49 np0005591285 systemd-udevd[222793]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 18:59:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:49.924 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4f39696a-9399-44c0-b152-265d25114632]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:49 np0005591285 NetworkManager[55017]: <info>  [1769039989.9368] device (tapc596dfbe-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 18:59:49 np0005591285 NetworkManager[55017]: <info>  [1769039989.9393] device (tapc596dfbe-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.985 182759 DEBUG nova.virt.libvirt.driver [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.986 182759 DEBUG nova.virt.libvirt.driver [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.986 182759 DEBUG nova.virt.libvirt.driver [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:a5:00:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:59:49 np0005591285 nova_compute[182755]: 2026-01-21 23:59:49.986 182759 DEBUG nova.virt.libvirt.driver [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] No VIF found with MAC fa:16:3e:24:23:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 18:59:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:49.986 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[66035040-685d-4d47-aa27-468aa5e3d4a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:49.990 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2a4a3e-6086-4e94-b990-11cd1bf3bb4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.016 182759 DEBUG nova.virt.libvirt.guest [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:name>tempest-tempest.common.compute-instance-1677916293</nova:name>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:59:50</nova:creationTime>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:port uuid="e685e997-ac00-415e-9109-6d37bbb2f577">
Jan 21 18:59:50 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    <nova:port uuid="c596dfbe-ce59-4ab9-8cad-bd4144812420">
Jan 21 18:59:50 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:50 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:59:50 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:59:50 np0005591285 nova_compute[182755]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.033 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae4bdbb-6ff8-4502-8547-91c7eb6f8219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.060 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[47c06db3-155f-4446-9322-3d44989b0e08]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451075, 'reachable_time': 32815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222800, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.074 182759 DEBUG oslo_concurrency.lockutils [None req-0897a1bf-70ff-4a2e-b97c-0dc15624dffe 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30dd1355-3b44-4697-89e2-e5c929a535ac-c596dfbe-ce59-4ab9-8cad-bd4144812420" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.092 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0d103a6d-1c14-4fd0-b645-72dd9fb68ccb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451090, 'tstamp': 451090}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222801, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451094, 'tstamp': 451094}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222801, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.094 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.096 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.097 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.097 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.097 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:50.098 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.390 182759 DEBUG nova.compute.manager [req-a7bff642-15ad-442f-82c8-807e9ed0b759 req-3b3d1011-8a40-445a-b6c0-cb307a99efc1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.391 182759 DEBUG oslo_concurrency.lockutils [req-a7bff642-15ad-442f-82c8-807e9ed0b759 req-3b3d1011-8a40-445a-b6c0-cb307a99efc1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.392 182759 DEBUG oslo_concurrency.lockutils [req-a7bff642-15ad-442f-82c8-807e9ed0b759 req-3b3d1011-8a40-445a-b6c0-cb307a99efc1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.392 182759 DEBUG oslo_concurrency.lockutils [req-a7bff642-15ad-442f-82c8-807e9ed0b759 req-3b3d1011-8a40-445a-b6c0-cb307a99efc1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.393 182759 DEBUG nova.compute.manager [req-a7bff642-15ad-442f-82c8-807e9ed0b759 req-3b3d1011-8a40-445a-b6c0-cb307a99efc1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.393 182759 WARNING nova.compute.manager [req-a7bff642-15ad-442f-82c8-807e9ed0b759 req-3b3d1011-8a40-445a-b6c0-cb307a99efc1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received unexpected event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:59:50 np0005591285 nova_compute[182755]: 2026-01-21 23:59:50.751 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:51 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:51Z|00275|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 18:59:51 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:51Z|00276|binding|INFO|Releasing lport 4a5cc35b-5169-43e2-b11f-202219aae22d from this chassis (sb_readonly=0)
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.320 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.505 182759 DEBUG oslo_concurrency.lockutils [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "interface-30dd1355-3b44-4697-89e2-e5c929a535ac-c596dfbe-ce59-4ab9-8cad-bd4144812420" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.507 182759 DEBUG oslo_concurrency.lockutils [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30dd1355-3b44-4697-89e2-e5c929a535ac-c596dfbe-ce59-4ab9-8cad-bd4144812420" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.521 182759 DEBUG nova.objects.instance [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'flavor' on Instance uuid 30dd1355-3b44-4697-89e2-e5c929a535ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.554 182759 DEBUG nova.virt.libvirt.vif [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.555 182759 DEBUG nova.network.os_vif_util [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.556 182759 DEBUG nova.network.os_vif_util [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.561 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.565 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.569 182759 DEBUG nova.virt.libvirt.driver [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Attempting to detach device tapc596dfbe-ce from instance 30dd1355-3b44-4697-89e2-e5c929a535ac from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.570 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <mac address="fa:16:3e:24:23:22"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <model type="virtio"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <mtu size="1442"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <target dev="tapc596dfbe-ce"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </interface>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.788 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.794 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface>not found in domain: <domain type='kvm' id='36'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <name>instance-0000004d</name>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <uuid>30dd1355-3b44-4697-89e2-e5c929a535ac</uuid>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:name>tempest-tempest.common.compute-instance-1677916293</nova:name>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:59:50</nova:creationTime>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:port uuid="e685e997-ac00-415e-9109-6d37bbb2f577">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:port uuid="c596dfbe-ce59-4ab9-8cad-bd4144812420">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <memory unit='KiB'>131072</memory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <vcpu placement='static'>1</vcpu>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <resource>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <partition>/machine</partition>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </resource>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <sysinfo type='smbios'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='manufacturer'>RDO</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='product'>OpenStack Compute</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='serial'>30dd1355-3b44-4697-89e2-e5c929a535ac</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='uuid'>30dd1355-3b44-4697-89e2-e5c929a535ac</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='family'>Virtual Machine</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <boot dev='hd'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <smbios mode='sysinfo'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <vmcoreinfo state='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <cpu mode='custom' match='exact' check='full'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <model fallback='forbid'>Nehalem</model>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <feature policy='require' name='x2apic'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <feature policy='require' name='hypervisor'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <feature policy='require' name='vme'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <clock offset='utc'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <timer name='pit' tickpolicy='delay'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <timer name='hpet' present='no'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <on_poweroff>destroy</on_poweroff>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <on_reboot>restart</on_reboot>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <on_crash>destroy</on_crash>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <disk type='file' device='disk'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk' index='2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <backingStore type='file' index='3'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <format type='raw'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <backingStore/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      </backingStore>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='vda' bus='virtio'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='virtio-disk0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <disk type='file' device='cdrom'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='qemu' type='raw' cache='none'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.config' index='1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <backingStore/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='sda' bus='sata'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <readonly/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='sata0-0-0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='0' model='pcie-root'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pcie.0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='1' port='0x10'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='2' port='0x11'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='3' port='0x12'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='4' port='0x13'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='5' port='0x14'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='6' port='0x15'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='7' port='0x16'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='8' port='0x17'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.8'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='9' port='0x18'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.9'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='10' port='0x19'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.10'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='11' port='0x1a'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.11'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='12' port='0x1b'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.12'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='13' port='0x1c'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.13'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='14' port='0x1d'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.14'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='15' port='0x1e'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.15'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='16' port='0x1f'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.16'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='17' port='0x20'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.17'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='18' port='0x21'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.18'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='19' port='0x22'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.19'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='20' port='0x23'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.20'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='21' port='0x24'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.21'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='22' port='0x25'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.22'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='23' port='0x26'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.23'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='24' port='0x27'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.24'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='25' port='0x28'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.25'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-pci-bridge'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.26'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='usb'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='sata' index='0'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='ide'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:a5:00:2a'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='tape685e997-ac'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='net0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:24:23:22'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='tapc596dfbe-ce'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='net1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <serial type='pty'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source path='/dev/pts/2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/console.log' append='off'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target type='isa-serial' port='0'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <model name='isa-serial'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      </target>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <console type='pty' tty='/dev/pts/2'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source path='/dev/pts/2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/console.log' append='off'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target type='serial' port='0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <input type='tablet' bus='usb'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='input0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='usb' bus='0' port='1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <input type='mouse' bus='ps2'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='input1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <input type='keyboard' bus='ps2'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='input2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <graphics type='vnc' port='5902' autoport='yes' listen='::0'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <listen type='address' address='::0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <audio id='1' type='none'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model type='virtio' heads='1' primary='yes'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='video0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <watchdog model='itco' action='reset'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='watchdog0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </watchdog>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <memballoon model='virtio'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <stats period='10'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='balloon0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <rng model='virtio'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <backend model='random'>/dev/urandom</backend>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='rng0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <label>system_u:system_r:svirt_t:s0:c2,c726</label>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c2,c726</imagelabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <label>+107:+107</label>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <imagelabel>+107:+107</imagelabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.796 182759 INFO nova.virt.libvirt.driver [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tapc596dfbe-ce from instance 30dd1355-3b44-4697-89e2-e5c929a535ac from the persistent domain config.#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.797 182759 DEBUG nova.virt.libvirt.driver [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] (1/8): Attempting to detach device tapc596dfbe-ce with device alias net1 from instance 30dd1355-3b44-4697-89e2-e5c929a535ac from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.799 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] detach device xml: <interface type="ethernet">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <mac address="fa:16:3e:24:23:22"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <model type="virtio"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <driver name="vhost" rx_queue_size="512"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <mtu size="1442"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <target dev="tapc596dfbe-ce"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </interface>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.859 182759 DEBUG nova.network.neutron [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updated VIF entry in instance network info cache for port c596dfbe-ce59-4ab9-8cad-bd4144812420. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.860 182759 DEBUG nova.network.neutron [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:51 np0005591285 kernel: tapc596dfbe-ce (unregistering): left promiscuous mode
Jan 21 18:59:51 np0005591285 NetworkManager[55017]: <info>  [1769039991.8648] device (tapc596dfbe-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.868 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:51 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:51Z|00277|binding|INFO|Releasing lport c596dfbe-ce59-4ab9-8cad-bd4144812420 from this chassis (sb_readonly=0)
Jan 21 18:59:51 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:51Z|00278|binding|INFO|Setting lport c596dfbe-ce59-4ab9-8cad-bd4144812420 down in Southbound
Jan 21 18:59:51 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:51Z|00279|binding|INFO|Removing iface tapc596dfbe-ce ovn-installed in OVS
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.873 182759 DEBUG nova.virt.libvirt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Received event <DeviceRemovedEvent: 1769039991.8729265, 30dd1355-3b44-4697-89e2-e5c929a535ac => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.873 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.875 182759 DEBUG nova.virt.libvirt.driver [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Start waiting for the detach event from libvirt for device tapc596dfbe-ce with device alias net1 for instance 30dd1355-3b44-4697-89e2-e5c929a535ac _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.876 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.879 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:23:22 10.100.0.14'], port_security=['fa:16:3e:24:23:22 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '30dd1355-3b44-4697-89e2-e5c929a535ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1512441673', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '9', 'neutron:security_group_ids': '453c6af8-25dc-4538-90e8-d74d46875cdc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c596dfbe-ce59-4ab9-8cad-bd4144812420) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.880 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:24:23:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc596dfbe-ce"/></interface>not found in domain: <domain type='kvm' id='36'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <name>instance-0000004d</name>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <uuid>30dd1355-3b44-4697-89e2-e5c929a535ac</uuid>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:name>tempest-tempest.common.compute-instance-1677916293</nova:name>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:59:50</nova:creationTime>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:port uuid="e685e997-ac00-415e-9109-6d37bbb2f577">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:port uuid="c596dfbe-ce59-4ab9-8cad-bd4144812420">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <memory unit='KiB'>131072</memory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <vcpu placement='static'>1</vcpu>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <resource>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <partition>/machine</partition>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </resource>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <sysinfo type='smbios'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <system>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='manufacturer'>RDO</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='product'>OpenStack Compute</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='serial'>30dd1355-3b44-4697-89e2-e5c929a535ac</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='uuid'>30dd1355-3b44-4697-89e2-e5c929a535ac</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <entry name='family'>Virtual Machine</entry>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </system>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <os>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <boot dev='hd'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <smbios mode='sysinfo'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </os>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <features>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <vmcoreinfo state='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </features>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <cpu mode='custom' match='exact' check='full'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <model fallback='forbid'>Nehalem</model>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <feature policy='require' name='x2apic'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <feature policy='require' name='hypervisor'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <feature policy='require' name='vme'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <clock offset='utc'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <timer name='pit' tickpolicy='delay'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <timer name='hpet' present='no'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </clock>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <on_poweroff>destroy</on_poweroff>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <on_reboot>restart</on_reboot>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <on_crash>destroy</on_crash>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <devices>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <disk type='file' device='disk'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='qemu' type='qcow2' cache='none'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk' index='2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <backingStore type='file' index='3'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <format type='raw'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <source file='/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <backingStore/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      </backingStore>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='vda' bus='virtio'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='virtio-disk0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <disk type='file' device='cdrom'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='qemu' type='raw' cache='none'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/disk.config' index='1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <backingStore/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='sda' bus='sata'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <readonly/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='sata0-0-0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='0' model='pcie-root'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pcie.0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='1' port='0x10'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='2' port='0x11'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='3' port='0x12'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='4' port='0x13'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='5' port='0x14'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='6' port='0x15'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='7' port='0x16'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='8' port='0x17'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.8'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='9' port='0x18'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.9'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='10' port='0x19'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.10'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.882 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c596dfbe-ce59-4ab9-8cad-bd4144812420 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='11' port='0x1a'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.11'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='12' port='0x1b'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.12'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='13' port='0x1c'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.13'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='14' port='0x1d'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.14'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='15' port='0x1e'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.15'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='16' port='0x1f'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.16'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='17' port='0x20'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.17'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='18' port='0x21'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.18'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='19' port='0x22'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.19'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='20' port='0x23'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.20'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='21' port='0x24'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.21'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='22' port='0x25'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.22'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='23' port='0x26'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.23'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='24' port='0x27'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.24'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-root-port'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target chassis='25' port='0x28'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.25'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model name='pcie-pci-bridge'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='pci.26'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='usb'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <controller type='sata' index='0'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='ide'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </controller>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <interface type='ethernet'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <mac address='fa:16:3e:a5:00:2a'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target dev='tape685e997-ac'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model type='virtio'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <driver name='vhost' rx_queue_size='512'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <mtu size='1442'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='net0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </interface>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <serial type='pty'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source path='/dev/pts/2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/console.log' append='off'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target type='isa-serial' port='0'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:        <model name='isa-serial'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      </target>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </serial>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <console type='pty' tty='/dev/pts/2'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <source path='/dev/pts/2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <log file='/var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac/console.log' append='off'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <target type='serial' port='0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='serial0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </console>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <input type='tablet' bus='usb'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='input0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='usb' bus='0' port='1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <input type='mouse' bus='ps2'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='input1'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <input type='keyboard' bus='ps2'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='input2'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </input>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <graphics type='vnc' port='5902' autoport='yes' listen='::0'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <listen type='address' address='::0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </graphics>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <audio id='1' type='none'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <video>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <model type='virtio' heads='1' primary='yes'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='video0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </video>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <watchdog model='itco' action='reset'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='watchdog0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </watchdog>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <memballoon model='virtio'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <stats period='10'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='balloon0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <rng model='virtio'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <backend model='random'>/dev/urandom</backend>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <alias name='rng0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </rng>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </devices>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <label>system_u:system_r:svirt_t:s0:c2,c726</label>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c2,c726</imagelabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <label>+107:+107</label>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <imagelabel>+107:+107</imagelabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </seclabel>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </domain>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.882 182759 INFO nova.virt.libvirt.driver [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully detached device tapc596dfbe-ce from instance 30dd1355-3b44-4697-89e2-e5c929a535ac from the live domain config.#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.883 182759 DEBUG nova.virt.libvirt.vif [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.883 182759 DEBUG nova.network.os_vif_util [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.884 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1995baab-0f8d-4658-a4fc-2d21868dc592#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.884 182759 DEBUG nova.network.os_vif_util [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.885 182759 DEBUG os_vif [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.888 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.888 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc596dfbe-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.895 182759 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.897 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.901 182759 INFO os_vif [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce')#033[00m
Jan 21 18:59:51 np0005591285 nova_compute[182755]: 2026-01-21 23:59:51.902 182759 DEBUG nova.virt.libvirt.guest [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:name>tempest-tempest.common.compute-instance-1677916293</nova:name>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:creationTime>2026-01-21 23:59:51</nova:creationTime>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:flavor name="m1.nano">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:memory>128</nova:memory>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:disk>1</nova:disk>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:swap>0</nova:swap>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:ephemeral>0</nova:ephemeral>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:vcpus>1</nova:vcpus>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:flavor>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:owner>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:user uuid="0f8ef02149394f2dac899fc3395b6bf7">tempest-AttachInterfacesTestJSON-658760528-project-member</nova:user>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:project uuid="717cc581e6a349a98dfd390d05b18624">tempest-AttachInterfacesTestJSON-658760528</nova:project>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:owner>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  <nova:ports>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    <nova:port uuid="e685e997-ac00-415e-9109-6d37bbb2f577">
Jan 21 18:59:51 np0005591285 nova_compute[182755]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:    </nova:port>
Jan 21 18:59:51 np0005591285 nova_compute[182755]:  </nova:ports>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: </nova:instance>
Jan 21 18:59:51 np0005591285 nova_compute[182755]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.911 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3439b3c4-2d20-4804-9d2a-d454c462ac13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.953 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[efce5a01-9f9c-4637-9a6a-c8cb1245c793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.958 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a009b817-cf98-4683-95c0-cab978a7af9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:51.996 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9331375b-9ba0-466a-8992-4e18d9bbffae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.022 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7c3381-ca80-4db1-84dc-54dabb573d90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1995baab-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:ff:2f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451075, 'reachable_time': 32815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222812, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.045 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1acd1eac-e1c7-474c-a878-6df546cf9c11]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451090, 'tstamp': 451090}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222813, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1995baab-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451094, 'tstamp': 451094}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222813, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.047 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.050 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.053 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1995baab-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.053 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.054 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1995baab-00, col_values=(('external_ids', {'iface-id': '4a5cc35b-5169-43e2-b11f-202219aae22d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:52.054 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.897 182759 DEBUG nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.898 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.898 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.899 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.899 182759 DEBUG nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.900 182759 WARNING nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received unexpected event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.900 182759 DEBUG nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-unplugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.901 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.901 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.902 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.902 182759 DEBUG nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-unplugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.903 182759 WARNING nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received unexpected event network-vif-unplugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.903 182759 DEBUG nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.904 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.905 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.905 182759 DEBUG oslo_concurrency.lockutils [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.906 182759 DEBUG nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:52 np0005591285 nova_compute[182755]: 2026-01-21 23:59:52.906 182759 WARNING nova.compute.manager [req-17d1370c-cf64-4c24-bdcf-78e1ce4809fa req-e2e1a472-6d3a-457c-b2fc-1f9be9a53d7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received unexpected event network-vif-plugged-c596dfbe-ce59-4ab9-8cad-bd4144812420 for instance with vm_state active and task_state None.#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.004 182759 DEBUG oslo_concurrency.lockutils [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.005 182759 DEBUG oslo_concurrency.lockutils [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquired lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.006 182759 DEBUG nova.network.neutron [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.859 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.860 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.860 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.860 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.860 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.874 182759 INFO nova.compute.manager [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Terminating instance#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.887 182759 DEBUG nova.compute.manager [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 18:59:54 np0005591285 kernel: tape685e997-ac (unregistering): left promiscuous mode
Jan 21 18:59:54 np0005591285 NetworkManager[55017]: <info>  [1769039994.9208] device (tape685e997-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.924 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:54 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:54Z|00280|binding|INFO|Releasing lport e685e997-ac00-415e-9109-6d37bbb2f577 from this chassis (sb_readonly=0)
Jan 21 18:59:54 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:54Z|00281|binding|INFO|Setting lport e685e997-ac00-415e-9109-6d37bbb2f577 down in Southbound
Jan 21 18:59:54 np0005591285 ovn_controller[94908]: 2026-01-21T23:59:54Z|00282|binding|INFO|Removing iface tape685e997-ac ovn-installed in OVS
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.928 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:54.938 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:00:2a 10.100.0.8'], port_security=['fa:16:3e:a5:00:2a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '30dd1355-3b44-4697-89e2-e5c929a535ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1995baab-0f8d-4658-a4fc-2d21868dc592', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '717cc581e6a349a98dfd390d05b18624', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e3b7d6e-99c3-4bed-a6db-24cc4d63ab1a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a84fa12f-731b-4479-8697-844749c5a76f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=e685e997-ac00-415e-9109-6d37bbb2f577) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 18:59:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:54.941 104259 INFO neutron.agent.ovn.metadata.agent [-] Port e685e997-ac00-415e-9109-6d37bbb2f577 in datapath 1995baab-0f8d-4658-a4fc-2d21868dc592 unbound from our chassis#033[00m
Jan 21 18:59:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:54.944 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1995baab-0f8d-4658-a4fc-2d21868dc592, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 18:59:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:54.945 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[523b39a2-6af1-4205-ab81-58f11dc0b037]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:54.946 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 namespace which is not needed anymore#033[00m
Jan 21 18:59:54 np0005591285 nova_compute[182755]: 2026-01-21 23:59:54.970 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:54 np0005591285 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 21 18:59:54 np0005591285 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004d.scope: Consumed 15.263s CPU time.
Jan 21 18:59:54 np0005591285 systemd-machined[154022]: Machine qemu-36-instance-0000004d terminated.
Jan 21 18:59:55 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [NOTICE]   (222518) : haproxy version is 2.8.14-c23fe91
Jan 21 18:59:55 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [NOTICE]   (222518) : path to executable is /usr/sbin/haproxy
Jan 21 18:59:55 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [WARNING]  (222518) : Exiting Master process...
Jan 21 18:59:55 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [ALERT]    (222518) : Current worker (222520) exited with code 143 (Terminated)
Jan 21 18:59:55 np0005591285 neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592[222514]: [WARNING]  (222518) : All workers exited. Exiting... (0)
Jan 21 18:59:55 np0005591285 systemd[1]: libpod-ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e.scope: Deactivated successfully.
Jan 21 18:59:55 np0005591285 podman[222839]: 2026-01-21 23:59:55.172221677 +0000 UTC m=+0.082090652 container died ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.198 182759 INFO nova.virt.libvirt.driver [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Instance destroyed successfully.#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.199 182759 DEBUG nova.objects.instance [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lazy-loading 'resources' on Instance uuid 30dd1355-3b44-4697-89e2-e5c929a535ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 18:59:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e-userdata-shm.mount: Deactivated successfully.
Jan 21 18:59:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay-042f0bda2a0faad399b1887b473b147061d18f9d27a76793a3947ff145b3c7b4-merged.mount: Deactivated successfully.
Jan 21 18:59:55 np0005591285 podman[222839]: 2026-01-21 23:59:55.215318602 +0000 UTC m=+0.125187567 container cleanup ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.220 182759 DEBUG nova.virt.libvirt.vif [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.222 182759 DEBUG nova.network.os_vif_util [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.223 182759 DEBUG nova.network.os_vif_util [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.223 182759 DEBUG os_vif [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.225 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.225 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape685e997-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.227 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.230 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.235 182759 INFO os_vif [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=e685e997-ac00-415e-9109-6d37bbb2f577,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape685e997-ac')#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.236 182759 DEBUG nova.virt.libvirt.vif [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1677916293',display_name='tempest-tempest.common.compute-instance-1677916293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1677916293',id=77,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGa3Z5wRILybgNVRpahBLmiLTAeMMxTHFRpSqeE8vf0/V2PbXMu+NKNirigUjrRZfax5519niVZ5m1wF5bYzERQCuKYHZS2P+HCnCDhrmGktv+3EqVL2XpKwAJqMQBW6VA==',key_name='tempest-keypair-1039920618',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='717cc581e6a349a98dfd390d05b18624',ramdisk_id='',reservation_id='r-eb7gua9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-658760528',owner_user_name='tempest-AttachInterfacesTestJSON-658760528-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0f8ef02149394f2dac899fc3395b6bf7',uuid=30dd1355-3b44-4697-89e2-e5c929a535ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.236 182759 DEBUG nova.network.os_vif_util [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converting VIF {"id": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "address": "fa:16:3e:24:23:22", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc596dfbe-ce", "ovs_interfaceid": "c596dfbe-ce59-4ab9-8cad-bd4144812420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.237 182759 DEBUG nova.network.os_vif_util [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.238 182759 DEBUG os_vif [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.240 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.241 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc596dfbe-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.241 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.244 182759 INFO os_vif [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:23:22,bridge_name='br-int',has_traffic_filtering=True,id=c596dfbe-ce59-4ab9-8cad-bd4144812420,network=Network(1995baab-0f8d-4658-a4fc-2d21868dc592),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc596dfbe-ce')#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.244 182759 INFO nova.virt.libvirt.driver [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Deleting instance files /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac_del#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.245 182759 INFO nova.virt.libvirt.driver [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Deletion of /var/lib/nova/instances/30dd1355-3b44-4697-89e2-e5c929a535ac_del complete#033[00m
Jan 21 18:59:55 np0005591285 systemd[1]: libpod-conmon-ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e.scope: Deactivated successfully.
Jan 21 18:59:55 np0005591285 podman[222881]: 2026-01-21 23:59:55.299977252 +0000 UTC m=+0.050964195 container remove ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.307 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[720da47f-0da8-4dd1-84e0-803084e87885]: (4, ('Wed Jan 21 11:59:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e)\nba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e\nWed Jan 21 11:59:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 (ba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e)\nba6419a95934f2494ff7459b960435620e39f9c811972af4fe1fd41021ff972e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.309 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f49ff4-9c2c-49b9-a58c-84babcf4db54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.310 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1995baab-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.312 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:55 np0005591285 kernel: tap1995baab-00: left promiscuous mode
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.333 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.338 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b08dd0a1-69a9-4590-b4e4-0b96a277750b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.347 182759 INFO nova.compute.manager [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.348 182759 DEBUG oslo.service.loopingcall [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.349 182759 DEBUG nova.compute.manager [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.349 182759 DEBUG nova.network.neutron [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.356 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4bbfd0-82ce-4308-817b-ea02f1271a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.359 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[654783ac-7d1d-46b9-978d-5491d470c1c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.380 182759 DEBUG nova.compute.manager [req-1726bd4e-94cf-4a3a-ab58-446b24b6440d req-b912b1c9-bb6f-4848-a33f-4ceb8e601f3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-unplugged-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.381 182759 DEBUG oslo_concurrency.lockutils [req-1726bd4e-94cf-4a3a-ab58-446b24b6440d req-b912b1c9-bb6f-4848-a33f-4ceb8e601f3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.381 182759 DEBUG oslo_concurrency.lockutils [req-1726bd4e-94cf-4a3a-ab58-446b24b6440d req-b912b1c9-bb6f-4848-a33f-4ceb8e601f3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.381 182759 DEBUG oslo_concurrency.lockutils [req-1726bd4e-94cf-4a3a-ab58-446b24b6440d req-b912b1c9-bb6f-4848-a33f-4ceb8e601f3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.381 182759 DEBUG nova.compute.manager [req-1726bd4e-94cf-4a3a-ab58-446b24b6440d req-b912b1c9-bb6f-4848-a33f-4ceb8e601f3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-unplugged-e685e997-ac00-415e-9109-6d37bbb2f577 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.382 182759 DEBUG nova.compute.manager [req-1726bd4e-94cf-4a3a-ab58-446b24b6440d req-b912b1c9-bb6f-4848-a33f-4ceb8e601f3d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-unplugged-e685e997-ac00-415e-9109-6d37bbb2f577 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.385 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6a488414-ce62-4b8c-a69f-e9dd5787d91f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451066, 'reachable_time': 25007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222898, 'error': None, 'target': 'ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.389 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1995baab-0f8d-4658-a4fc-2d21868dc592 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 18:59:55 np0005591285 systemd[1]: run-netns-ovnmeta\x2d1995baab\x2d0f8d\x2d4658\x2da4fc\x2d2d21868dc592.mount: Deactivated successfully.
Jan 21 18:59:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-21 23:59:55.390 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[333fde44-447f-414b-884e-0c7ed66faa9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 18:59:55 np0005591285 nova_compute[182755]: 2026-01-21 23:59:55.784 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.038 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039981.0374613, 83fe04ea-7d77-4003-9276-6a7d268e942a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.039 182759 INFO nova.compute.manager [-] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] VM Stopped (Lifecycle Event)#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.069 182759 DEBUG nova.compute.manager [None req-44a66668-8c16-4d4d-a245-0d582bd76165 - - - - - -] [instance: 83fe04ea-7d77-4003-9276-6a7d268e942a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.639 182759 INFO nova.network.neutron [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Port c596dfbe-ce59-4ab9-8cad-bd4144812420 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.639 182759 DEBUG nova.network.neutron [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [{"id": "e685e997-ac00-415e-9109-6d37bbb2f577", "address": "fa:16:3e:a5:00:2a", "network": {"id": "1995baab-0f8d-4658-a4fc-2d21868dc592", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-54296518-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "717cc581e6a349a98dfd390d05b18624", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape685e997-ac", "ovs_interfaceid": "e685e997-ac00-415e-9109-6d37bbb2f577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.662 182759 DEBUG oslo_concurrency.lockutils [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Releasing lock "refresh_cache-30dd1355-3b44-4697-89e2-e5c929a535ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 18:59:56 np0005591285 nova_compute[182755]: 2026-01-21 23:59:56.699 182759 DEBUG oslo_concurrency.lockutils [None req-83be6c55-c85f-489d-9178-d17bd4158a6f 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "interface-30dd1355-3b44-4697-89e2-e5c929a535ac-c596dfbe-ce59-4ab9-8cad-bd4144812420" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:57 np0005591285 nova_compute[182755]: 2026-01-21 23:59:57.516 182759 DEBUG nova.compute.manager [req-ab465cf0-cd96-4374-ad19-59e7d43b9c00 req-8d8aa231-09cf-4e9d-9538-aede671ef0ea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:57 np0005591285 nova_compute[182755]: 2026-01-21 23:59:57.517 182759 DEBUG oslo_concurrency.lockutils [req-ab465cf0-cd96-4374-ad19-59e7d43b9c00 req-8d8aa231-09cf-4e9d-9538-aede671ef0ea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:57 np0005591285 nova_compute[182755]: 2026-01-21 23:59:57.517 182759 DEBUG oslo_concurrency.lockutils [req-ab465cf0-cd96-4374-ad19-59e7d43b9c00 req-8d8aa231-09cf-4e9d-9538-aede671ef0ea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:57 np0005591285 nova_compute[182755]: 2026-01-21 23:59:57.518 182759 DEBUG oslo_concurrency.lockutils [req-ab465cf0-cd96-4374-ad19-59e7d43b9c00 req-8d8aa231-09cf-4e9d-9538-aede671ef0ea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:57 np0005591285 nova_compute[182755]: 2026-01-21 23:59:57.518 182759 DEBUG nova.compute.manager [req-ab465cf0-cd96-4374-ad19-59e7d43b9c00 req-8d8aa231-09cf-4e9d-9538-aede671ef0ea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] No waiting events found dispatching network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 18:59:57 np0005591285 nova_compute[182755]: 2026-01-21 23:59:57.518 182759 WARNING nova.compute.manager [req-ab465cf0-cd96-4374-ad19-59e7d43b9c00 req-8d8aa231-09cf-4e9d-9538-aede671ef0ea 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received unexpected event network-vif-plugged-e685e997-ac00-415e-9109-6d37bbb2f577 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.247 182759 DEBUG nova.network.neutron [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.281 182759 INFO nova.compute.manager [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Took 2.93 seconds to deallocate network for instance.#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.367 182759 DEBUG nova.compute.manager [req-98a394f1-0e64-43d8-9823-c6ea33303ebe req-5425c9e2-4e61-4c79-b68f-26e401eb32cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Received event network-vif-deleted-e685e997-ac00-415e-9109-6d37bbb2f577 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.382 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.382 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.503 182759 DEBUG nova.compute.provider_tree [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.530 182759 DEBUG nova.scheduler.client.report [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.565 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.589 182759 INFO nova.scheduler.client.report [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Deleted allocations for instance 30dd1355-3b44-4697-89e2-e5c929a535ac#033[00m
Jan 21 18:59:58 np0005591285 nova_compute[182755]: 2026-01-21 23:59:58.688 182759 DEBUG oslo_concurrency.lockutils [None req-09c4cded-4606-47ad-abff-dbdd0fbb938a 0f8ef02149394f2dac899fc3395b6bf7 717cc581e6a349a98dfd390d05b18624 - - default default] Lock "30dd1355-3b44-4697-89e2-e5c929a535ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:00 np0005591285 nova_compute[182755]: 2026-01-22 00:00:00.230 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:00 np0005591285 nova_compute[182755]: 2026-01-22 00:00:00.785 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:01 np0005591285 systemd[1]: Starting update of the root trust anchor for DNSSEC validation in unbound...
Jan 21 19:00:01 np0005591285 systemd[1]: Starting Rotate log files...
Jan 21 19:00:01 np0005591285 systemd[1]: unbound-anchor.service: Deactivated successfully.
Jan 21 19:00:01 np0005591285 systemd[1]: Finished update of the root trust anchor for DNSSEC validation in unbound.
Jan 21 19:00:01 np0005591285 podman[222900]: 2026-01-22 00:00:01.223602971 +0000 UTC m=+0.078826906 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 19:00:01 np0005591285 podman[222899]: 2026-01-22 00:00:01.234222843 +0000 UTC m=+0.089442408 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git)
Jan 21 19:00:01 np0005591285 systemd[1]: logrotate.service: Deactivated successfully.
Jan 21 19:00:01 np0005591285 systemd[1]: Finished Rotate log files.
Jan 21 19:00:01 np0005591285 nova_compute[182755]: 2026-01-22 00:00:01.346 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:02.965 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:02.966 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:02.967 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:05 np0005591285 nova_compute[182755]: 2026-01-22 00:00:05.234 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:05 np0005591285 nova_compute[182755]: 2026-01-22 00:00:05.788 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:07 np0005591285 nova_compute[182755]: 2026-01-22 00:00:07.387 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:08 np0005591285 podman[222942]: 2026-01-22 00:00:08.223676188 +0000 UTC m=+0.078510017 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:00:10 np0005591285 nova_compute[182755]: 2026-01-22 00:00:10.195 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039995.1933424, 30dd1355-3b44-4697-89e2-e5c929a535ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:00:10 np0005591285 nova_compute[182755]: 2026-01-22 00:00:10.196 182759 INFO nova.compute.manager [-] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:00:10 np0005591285 nova_compute[182755]: 2026-01-22 00:00:10.238 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:10 np0005591285 nova_compute[182755]: 2026-01-22 00:00:10.247 182759 DEBUG nova.compute.manager [None req-7d85b737-5553-46cf-8ee6-14b64d9d2e39 - - - - - -] [instance: 30dd1355-3b44-4697-89e2-e5c929a535ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:10 np0005591285 nova_compute[182755]: 2026-01-22 00:00:10.823 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:11 np0005591285 podman[222979]: 2026-01-22 00:00:11.216927622 +0000 UTC m=+0.074239714 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:00:11 np0005591285 podman[222980]: 2026-01-22 00:00:11.2424534 +0000 UTC m=+0.091824601 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:00:11 np0005591285 podman[222981]: 2026-01-22 00:00:11.277937174 +0000 UTC m=+0.112286756 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.433 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.434 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.453 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.834 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.834 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.842 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.842 182759 INFO nova.compute.claims [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:00:14 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.984 182759 DEBUG nova.compute.provider_tree [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:14.999 182759 DEBUG nova.scheduler.client.report [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.043 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.044 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.125 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.126 182759 DEBUG nova.network.neutron [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.152 182759 INFO nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.177 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.242 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.328 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.330 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.331 182759 INFO nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Creating image(s)
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.333 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "/var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.333 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "/var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.335 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "/var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.374 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.458 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.459 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.459 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.472 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.538 182759 DEBUG nova.policy [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.564 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.565 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.612 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.615 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.616 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.681 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.682 182759 DEBUG nova.virt.disk.api [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Checking if we can resize image /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.683 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.743 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.744 182759 DEBUG nova.virt.disk.api [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Cannot resize image /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.745 182759 DEBUG nova.objects.instance [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lazy-loading 'migration_context' on Instance uuid c5085649-028e-44f3-b7fa-53f19fc0a7de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.771 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.772 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Ensure instance console log exists: /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.773 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.773 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.773 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:15 np0005591285 nova_compute[182755]: 2026-01-22 00:00:15.887 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:16 np0005591285 nova_compute[182755]: 2026-01-22 00:00:16.358 182759 DEBUG nova.network.neutron [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Successfully created port: 9a8b67ec-5e81-4335-80f2-76197d4f6f9e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 19:00:16 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:16Z|00283|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 19:00:16 np0005591285 nova_compute[182755]: 2026-01-22 00:00:16.971 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:17 np0005591285 nova_compute[182755]: 2026-01-22 00:00:17.477 182759 DEBUG nova.network.neutron [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Successfully updated port: 9a8b67ec-5e81-4335-80f2-76197d4f6f9e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 19:00:17 np0005591285 nova_compute[182755]: 2026-01-22 00:00:17.494 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "refresh_cache-c5085649-028e-44f3-b7fa-53f19fc0a7de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:00:17 np0005591285 nova_compute[182755]: 2026-01-22 00:00:17.495 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquired lock "refresh_cache-c5085649-028e-44f3-b7fa-53f19fc0a7de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:00:17 np0005591285 nova_compute[182755]: 2026-01-22 00:00:17.495 182759 DEBUG nova.network.neutron [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:00:17 np0005591285 nova_compute[182755]: 2026-01-22 00:00:17.675 182759 DEBUG nova.network.neutron [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 19:00:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:17.905 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:00:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:17.907 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 19:00:17 np0005591285 nova_compute[182755]: 2026-01-22 00:00:17.907 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:18 np0005591285 nova_compute[182755]: 2026-01-22 00:00:18.030 182759 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-changed-9a8b67ec-5e81-4335-80f2-76197d4f6f9e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:00:18 np0005591285 nova_compute[182755]: 2026-01-22 00:00:18.031 182759 DEBUG nova.compute.manager [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Refreshing instance network info cache due to event network-changed-9a8b67ec-5e81-4335-80f2-76197d4f6f9e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:00:18 np0005591285 nova_compute[182755]: 2026-01-22 00:00:18.032 182759 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c5085649-028e-44f3-b7fa-53f19fc0a7de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.215 182759 DEBUG nova.network.neutron [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Updating instance_info_cache with network_info: [{"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.245 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Releasing lock "refresh_cache-c5085649-028e-44f3-b7fa-53f19fc0a7de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.245 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Instance network_info: |[{"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.246 182759 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c5085649-028e-44f3-b7fa-53f19fc0a7de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.246 182759 DEBUG nova.network.neutron [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Refreshing network info cache for port 9a8b67ec-5e81-4335-80f2-76197d4f6f9e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.250 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Start _get_guest_xml network_info=[{"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.257 182759 WARNING nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.266 182759 DEBUG nova.virt.libvirt.host [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.267 182759 DEBUG nova.virt.libvirt.host [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.271 182759 DEBUG nova.virt.libvirt.host [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.272 182759 DEBUG nova.virt.libvirt.host [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.273 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.273 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.274 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.274 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.274 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.274 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.274 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.274 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.275 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.275 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.275 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.275 182759 DEBUG nova.virt.hardware [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.279 182759 DEBUG nova.virt.libvirt.vif [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:00:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-196894615',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-196894615',id=81,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef52bd80396048e796f6c2dbf7295b47',ramdisk_id='',reservation_id='r-ja0ml4i1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1413291072',owner_user_name='tempest-ServerTagsTestJSON-1413291072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:15Z,user_data=None,user_id='9c8bb39ebd324063b1c1044104d8fe0d',uuid=c5085649-028e-44f3-b7fa-53f19fc0a7de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.279 182759 DEBUG nova.network.os_vif_util [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Converting VIF {"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.280 182759 DEBUG nova.network.os_vif_util [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.280 182759 DEBUG nova.objects.instance [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5085649-028e-44f3-b7fa-53f19fc0a7de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.294 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <uuid>c5085649-028e-44f3-b7fa-53f19fc0a7de</uuid>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <name>instance-00000051</name>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerTagsTestJSON-server-196894615</nova:name>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:00:19</nova:creationTime>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:user uuid="9c8bb39ebd324063b1c1044104d8fe0d">tempest-ServerTagsTestJSON-1413291072-project-member</nova:user>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:project uuid="ef52bd80396048e796f6c2dbf7295b47">tempest-ServerTagsTestJSON-1413291072</nova:project>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        <nova:port uuid="9a8b67ec-5e81-4335-80f2-76197d4f6f9e">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <entry name="serial">c5085649-028e-44f3-b7fa-53f19fc0a7de</entry>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <entry name="uuid">c5085649-028e-44f3-b7fa-53f19fc0a7de</entry>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.config"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:31:f9:41"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <target dev="tap9a8b67ec-5e"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/console.log" append="off"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:00:19 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:00:19 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:00:19 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:00:19 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.294 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Preparing to wait for external event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.295 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.295 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.295 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.296 182759 DEBUG nova.virt.libvirt.vif [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:00:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-196894615',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-196894615',id=81,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef52bd80396048e796f6c2dbf7295b47',ramdisk_id='',reservation_id='r-ja0ml4i1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1413291072',owner_user_name='tempest-ServerTagsTestJSON-1413291072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:15Z,user_data=None,user_id='9c8bb39ebd324063b1c1044104d8fe0d',uuid=c5085649-028e-44f3-b7fa-53f19fc0a7de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.296 182759 DEBUG nova.network.os_vif_util [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Converting VIF {"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.296 182759 DEBUG nova.network.os_vif_util [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.297 182759 DEBUG os_vif [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.297 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.298 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.298 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.301 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.302 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a8b67ec-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.302 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9a8b67ec-5e, col_values=(('external_ids', {'iface-id': '9a8b67ec-5e81-4335-80f2-76197d4f6f9e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:f9:41', 'vm-uuid': 'c5085649-028e-44f3-b7fa-53f19fc0a7de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:19 np0005591285 NetworkManager[55017]: <info>  [1769040019.3067] manager: (tap9a8b67ec-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.305 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.310 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.314 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.316 182759 INFO os_vif [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e')#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.381 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.382 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.382 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] No VIF found with MAC fa:16:3e:31:f9:41, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:00:19 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.383 182759 INFO nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Using config drive#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:19.998 182759 INFO nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Creating config drive at /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.config#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.010 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppshpxutf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.156 182759 DEBUG oslo_concurrency.processutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppshpxutf" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:00:20 np0005591285 kernel: tap9a8b67ec-5e: entered promiscuous mode
Jan 21 19:00:20 np0005591285 NetworkManager[55017]: <info>  [1769040020.2379] manager: (tap9a8b67ec-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.237 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:20Z|00284|binding|INFO|Claiming lport 9a8b67ec-5e81-4335-80f2-76197d4f6f9e for this chassis.
Jan 21 19:00:20 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:20Z|00285|binding|INFO|9a8b67ec-5e81-4335-80f2-76197d4f6f9e: Claiming fa:16:3e:31:f9:41 10.100.0.3
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.253 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:f9:41 10.100.0.3'], port_security=['fa:16:3e:31:f9:41 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af35880e-f8b9-4463-bac9-70c95c551a8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'neutron:revision_number': '2', 'neutron:security_group_ids': '35be5298-c227-4fe0-9a03-50c7aabd6783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6ca68d07-06cf-4a9c-b5e8-6a0c54b84078, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=9a8b67ec-5e81-4335-80f2-76197d4f6f9e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:00:20 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:20Z|00286|binding|INFO|Setting lport 9a8b67ec-5e81-4335-80f2-76197d4f6f9e ovn-installed in OVS
Jan 21 19:00:20 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:20Z|00287|binding|INFO|Setting lport 9a8b67ec-5e81-4335-80f2-76197d4f6f9e up in Southbound
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.256 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.257 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 9a8b67ec-5e81-4335-80f2-76197d4f6f9e in datapath af35880e-f8b9-4463-bac9-70c95c551a8c bound to our chassis#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.261 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network af35880e-f8b9-4463-bac9-70c95c551a8c#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.274 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c225acce-09e5-4e87-a115-31ad0c596bc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 systemd-udevd[223084]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.275 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaf35880e-f1 in ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.277 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaf35880e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.278 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c27a95b9-1f79-4ba8-a73e-eef49c9f0785]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.279 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9e6dac-b73f-4d18-97b5-a4e283ea4dcf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 NetworkManager[55017]: <info>  [1769040020.2869] device (tap9a8b67ec-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:00:20 np0005591285 NetworkManager[55017]: <info>  [1769040020.2876] device (tap9a8b67ec-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:00:20 np0005591285 systemd-machined[154022]: New machine qemu-37-instance-00000051.
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.293 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[35522778-8eb9-48dc-9eca-4df00ba442f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.312 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1a60d512-6cd8-464f-b9a3-bca482587d01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 systemd[1]: Started Virtual Machine qemu-37-instance-00000051.
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.340 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4d896af6-98b5-4b3c-b2d8-c955cad4513d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 NetworkManager[55017]: <info>  [1769040020.3467] manager: (tapaf35880e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/141)
Jan 21 19:00:20 np0005591285 systemd-udevd[223088]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.345 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e957b460-dac4-4f80-b810-c6b13071b8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.379 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8c2540-391d-4e21-9018-b0a00a2785f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.404 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[7c52061e-abbe-4e98-a827-e7efa024ae18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 NetworkManager[55017]: <info>  [1769040020.4263] device (tapaf35880e-f0): carrier: link connected
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.430 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[15ae41b7-f403-405d-bca6-398982adeeab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.447 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ed303ef7-fbe1-4ab7-a51b-e0ac6220c6b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf35880e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:d7:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457908, 'reachable_time': 19248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223117, 'error': None, 'target': 'ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.462 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3952bbe3-83e7-4ab1-ab3e-31e243ec14d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:d7e2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457908, 'tstamp': 457908}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223118, 'error': None, 'target': 'ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.479 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b7900e12-15b8-4602-949e-850447e8d7f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf35880e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:d7:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457908, 'reachable_time': 19248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223119, 'error': None, 'target': 'ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.506 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0462ee27-bc7a-470d-ae54-3a6e2a85ba78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.573 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3afed6b8-d0af-4459-a60f-fa44b105b308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.574 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf35880e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.575 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.575 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf35880e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 NetworkManager[55017]: <info>  [1769040020.5776] manager: (tapaf35880e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 21 19:00:20 np0005591285 kernel: tapaf35880e-f0: entered promiscuous mode
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.580 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.581 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaf35880e-f0, col_values=(('external_ids', {'iface-id': 'f58df12c-8feb-407c-8f5c-4e87296da4c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.582 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:20Z|00288|binding|INFO|Releasing lport f58df12c-8feb-407c-8f5c-4e87296da4c5 from this chassis (sb_readonly=0)
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.588 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.589 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/af35880e-f8b9-4463-bac9-70c95c551a8c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/af35880e-f8b9-4463-bac9-70c95c551a8c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.590 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3c279c04-da2e-4517-966f-a6a639f06f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.591 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-af35880e-f8b9-4463-bac9-70c95c551a8c
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/af35880e-f8b9-4463-bac9-70c95c551a8c.pid.haproxy
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID af35880e-f8b9-4463-bac9-70c95c551a8c
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:00:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:20.592 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c', 'env', 'PROCESS_TAG=haproxy-af35880e-f8b9-4463-bac9-70c95c551a8c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/af35880e-f8b9-4463-bac9-70c95c551a8c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.606 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:20 np0005591285 nova_compute[182755]: 2026-01-22 00:00:20.889 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:21 np0005591285 podman[223150]: 2026-01-22 00:00:21.029065489 +0000 UTC m=+0.064900066 container create 618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:00:21 np0005591285 systemd[1]: Started libpod-conmon-618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2.scope.
Jan 21 19:00:21 np0005591285 podman[223150]: 2026-01-22 00:00:20.992322812 +0000 UTC m=+0.028157389 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:00:21 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:00:21 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3b87c384e1221e6166bbada1ca5643e88c705a780de132f530570d75a701510/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:00:21 np0005591285 podman[223150]: 2026-01-22 00:00:21.122233285 +0000 UTC m=+0.158067842 container init 618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:00:21 np0005591285 podman[223150]: 2026-01-22 00:00:21.128297536 +0000 UTC m=+0.164132093 container start 618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 19:00:21 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [NOTICE]   (223176) : New worker (223179) forked
Jan 21 19:00:21 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [NOTICE]   (223176) : Loading success.
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.203 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040021.2033782, c5085649-028e-44f3-b7fa-53f19fc0a7de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.204 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] VM Started (Lifecycle Event)#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.368 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.373 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040021.2035837, c5085649-028e-44f3-b7fa-53f19fc0a7de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.374 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.398 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.403 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.428 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.528 182759 DEBUG nova.network.neutron [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Updated VIF entry in instance network info cache for port 9a8b67ec-5e81-4335-80f2-76197d4f6f9e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.529 182759 DEBUG nova.network.neutron [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Updating instance_info_cache with network_info: [{"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.549 182759 DEBUG oslo_concurrency.lockutils [req-2c092b11-8c34-42da-aa6f-f148b36a9ea6 req-16cdaecd-f41d-4bdb-a130-11db6e967948 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c5085649-028e-44f3-b7fa-53f19fc0a7de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.594 182759 DEBUG nova.compute.manager [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.595 182759 DEBUG oslo_concurrency.lockutils [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.595 182759 DEBUG oslo_concurrency.lockutils [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.596 182759 DEBUG oslo_concurrency.lockutils [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.596 182759 DEBUG nova.compute.manager [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Processing event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.596 182759 DEBUG nova.compute.manager [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.597 182759 DEBUG oslo_concurrency.lockutils [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.597 182759 DEBUG oslo_concurrency.lockutils [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.597 182759 DEBUG oslo_concurrency.lockutils [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.598 182759 DEBUG nova.compute.manager [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] No waiting events found dispatching network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.598 182759 WARNING nova.compute.manager [req-069fb5a2-f9fd-4bc1-853f-185b3c5b45c8 req-273f2b92-5980-457c-94e5-87a77ae74738 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received unexpected event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.599 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.603 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040021.6032755, c5085649-028e-44f3-b7fa-53f19fc0a7de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.603 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.606 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.610 182759 INFO nova.virt.libvirt.driver [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Instance spawned successfully.#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.611 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.626 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.634 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.638 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.639 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.639 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.640 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.641 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.641 182759 DEBUG nova.virt.libvirt.driver [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.655 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.737 182759 INFO nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Took 6.41 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.738 182759 DEBUG nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.860 182759 INFO nova.compute.manager [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Took 7.35 seconds to build instance.#033[00m
Jan 21 19:00:21 np0005591285 nova_compute[182755]: 2026-01-22 00:00:21.889 182759 DEBUG oslo_concurrency.lockutils [None req-4db8cc37-579f-4a37-b2e4-4c5682194607 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.164 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'name': 'tempest-ServerTagsTestJSON-server-196894615', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000051', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ef52bd80396048e796f6c2dbf7295b47', 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'hostId': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.169 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9308be91-9a92-4389-939a-8b03d37474cf', 'name': 'tempest-ServerActionsTestJSON-server-396111842', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'cccb624dbe6d4401a89e9cd254f91828', 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'hostId': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.171 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.171 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>]
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.206 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.read.latency volume: 139492517 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.208 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.read.latency volume: 513253 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.238 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.read.latency volume: 382888263 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.239 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.read.latency volume: 29665594 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b27641ac-34ec-4241-8bf0-645f2f84763e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 139492517, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.173241', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5945b4c0-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '9db18d185da29b9ca81c8cf470fb4c4c08779c024af2787e838249db5f0a55ee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 513253, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 
'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.173241', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5945e558-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '895889f9695b4dcb0fc8b264a996688c063acf1e5d4b332371e623686b75f1e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 382888263, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.173241', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '594a8d4c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'c3b3a2384f7959361ba427e953244217b26525def38eeb2e642cc10e041c25a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29665594, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.173241', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '594a9a80-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'b0738c2b3b2a2fdc20dc2366ef7b065d2937954db742e720b64ba297b40da7cd'}]}, 'timestamp': '2026-01-22 00:00:23.240313', '_unique_id': '06fc7efc06864137a2efd4b5a19f15ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.246 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.254 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c5085649-028e-44f3-b7fa-53f19fc0a7de / tap9a8b67ec-5e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.255 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.257 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9308be91-9a92-4389-939a-8b03d37474cf / tapd96fb6bb-97 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.257 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b520e53-2126-409a-94bc-6151f586d9f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.248164', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '594cfc1c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': '9da5101329de92abce3e07a4ffef53f252ecb9b4002f4051daf44000cc48b9ba'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.248164', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '594d6990-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '6601aa2b099f8d2319df5e05de678fabbcf257a778ebdf464db60c9be4814da6'}]}, 'timestamp': '2026-01-22 00:00:23.258281', '_unique_id': 'b6b840d0fe174f689c886f77a0a7b067'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.260 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.260 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.outgoing.bytes volume: 5524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f850c7a0-18ad-42e8-851d-07fcf9bd379b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.260344', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '594dc5e8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': '1c69ab082eca59119ac892e26dfcc6367a82beb31f2fd1113860ad5272ac553d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5524, 'user_id': 
'3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.260344', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '594dd006-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': 'bc6afd5561c92f0453266aa515f0d976c8d9afbdfd6ab1e0f771418674eafb96'}]}, 'timestamp': '2026-01-22 00:00:23.260899', '_unique_id': 'e4917f031d3d45419a0b39ffee808001'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.261 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.262 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.262 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>]
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.262 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.263 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f63eec37-3131-4587-8d6d-45086ee9ce6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.262923', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '594e2cb8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': 'f033b92d10775332c4fae7360660eddb83c6145f7f6592f659c981cca836763d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.262923', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '594e36c2-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '084acfe1aa87f3a6fe4c8b88961ba90b297b1f10266e41df9627bb06adc993b5'}]}, 'timestamp': '2026-01-22 00:00:23.263510', '_unique_id': '88e4344fc53c4ddb9694e84d8a87621f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.265 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.265 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.265 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.read.requests volume: 1218 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.266 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '317b15af-f686-4075-bcab-ba69689d63e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.265333', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '594e8c1c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '96e666c8a30054f8a50afa12ead971a48f2915e667d82ce0419b4e09463e6697'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 
'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.265333', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '594e963a-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '0a14d24a87d3b801207bc17f3f515cc9dc7b1c8623daef23b9ce6bb61a841e62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1218, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.265333', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '594ea27e-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': '41ceef4788cfc5edd87e79351b44f2e2f3d918fb9387b5c72076ba527a814c9d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.265333', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '594eabac-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': '1e8d548c56fa41f7c5534afe08b1e5200f1cc3593464ae151659d16281c8f98e'}]}, 'timestamp': '2026-01-22 00:00:23.266495', '_unique_id': 'e875153efc634d9aae1615d822c3d3ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.267 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.285 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/cpu volume: 1610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.301 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/cpu volume: 12310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9851d3b-115e-4c99-b5c7-c923fdce98fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1610000000, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'timestamp': '2026-01-22T00:00:23.268116', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5951b66c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.004856314, 'message_signature': 'f635ad70babfedc6d46b2f006be99a43eae76c1141f06d7ef55ad3937c0b84d6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12310000000, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 
'9308be91-9a92-4389-939a-8b03d37474cf', 'timestamp': '2026-01-22T00:00:23.268116', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '595420fa-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.020749947, 'message_signature': '048157bce4522f90158b1172316594730d3ee4e11d037a6d5bf9a22037aafca0'}]}, 'timestamp': '2026-01-22 00:00:23.302351', '_unique_id': 'ae124c56952c4996bc4b4ae2e2bf0312'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.304 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.304 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.305 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.write.requests volume: 51 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.305 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a4c710a-f07e-4dfa-b2ba-308e40d360ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.304468', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '59548176-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': 'e631d2c7737cc069ed8e9fdfff181bab38802ccf0e04309c972389ee41cf9433'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 
'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.304468', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '59548bbc-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': 'adf1625e39ea2db35ccf8c2cab1a24e85e5eb85d5a7cb0967fc49ad0d19ea361'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 51, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.304468', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595496d4-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'd065d8406d49cef07a82a832bd8da0335d721800a3b6b9213267face83816b92'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.304468', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5954a408-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': '1c6fef092ebe902dde851eb3973220be535f62ea1407c0549fe013d509096f08'}]}, 'timestamp': '2026-01-22 00:00:23.305630', '_unique_id': 'fb5f0ad9228340d3ae443f79982055b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.306 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.307 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.307 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.307 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.incoming.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83cc8068-541a-4cea-a035-4b1063e13245', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.307510', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '5954f9f8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': 'ac56607fedb3159ecf6bf642c2e3745838c029a6dbafd1caf5e9c6f01d6ca2e5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': 
'3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.307510', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '595504d4-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '77bb796a2794106d559d09a21ee16ddc09eb0434be9411673a158629b2966c63'}]}, 'timestamp': '2026-01-22 00:00:23.308107', '_unique_id': 'a607e127ca344885bee60bb1905084aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.310 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.read.bytes volume: 32090112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.310 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed08e845-1c40-4b05-a908-555c16cd4af0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.309582', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '59554908-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '049efff18dc8bc1ef69c189a334ed6ae927faf7acb04d08c1dc7e32cc36179af'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 
'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.309582', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595554de-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': 'f804ec5023d848de94aa721c5d800355ea968f19d9137fc57bdbbd49791fcda4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32090112, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.309582', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5955626c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'fde951c172d49dbc66e78ba8ca3359aab8aa93fcce2a64b2a5614113b02f992f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.309582', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '59556bfe-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'df70cb59d44547f97e48ddda52ce24aed4d6d50f4ba23bd243c5e84ce3a59c5f'}]}, 'timestamp': '2026-01-22 00:00:23.310741', '_unique_id': '0f3f6ef2e3024c95afcd6a9c128ab542'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.311 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.312 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.312 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.312 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9bb7523-001f-419f-bb5a-f018af4a9237', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.312488', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '5955bae6-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': 'fe21beeb60aca410a7529c6a095742931cb845349ef809275f963eddd3382c89'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.312488', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '5955c57c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '35c86753036f93213a3114fa9205657cd23658e1fb0c60ce248b25ab6891d58e'}]}, 'timestamp': '2026-01-22 00:00:23.313040', '_unique_id': 'f2094de15e0e4d29aa000b4b82e16159'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.313 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.315 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.315 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.315 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>]
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.315 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.315 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.315 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23e86176-56ac-42bb-8203-4f50a00db369', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.315679', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '5956371e-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': '04812b366480426cb1a3a7c75b0e038ef692f70caee7d3d22bb709f0617a759b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.315679', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '59564358-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '00a4b68338465bf1ffaa652ac577254f1728c7d524af6b36e7bd4b5f8943e0cb'}]}, 'timestamp': '2026-01-22 00:00:23.316305', '_unique_id': '93644fdb2d4e45dd82622185079e1a29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.316 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.317 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.outgoing.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45a726ed-85b8-44c8-aef0-410bd9027ba3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.317809', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '59568b38-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': '107ae0b2706ba88898b2f8879d230cbfee7438db1219e48227650324a91ce2e3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 'user_id': 
'3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.317809', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '59569628-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '404ab7d2dc757900a4c938d1638d5f0c9dadfe4b8dfb309acd3c76f473e99a63'}]}, 'timestamp': '2026-01-22 00:00:23.318381', '_unique_id': '9820c9773a1f4645b9578a39eed83da4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.318 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.320 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.320 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.320 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.write.latency volume: 58109503 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.320 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2e85863-aeb2-48a7-8693-bcb0125e10e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.320118', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5956e574-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': 'ef89e3e5a1ca09bd98cd3c70ea33dae70bf0b7b01d59e6623d3003d05b424de0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 
'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.320118', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5956f028-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '09d7a39074637d0e3ea7f300355c99506598bae21d3c2896ee38376eb08bb98b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58109503, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.320118', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5956f9c4-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'f93a7aec24615e1e25c2e4edaf516f52e9abf931fd75dc31fb090a2df7233109'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.320118', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5957041e-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'da8167b40b591e50ac4dc0cf33db6cf7167cb7411c6106b73de501631aea985a'}]}, 'timestamp': '2026-01-22 00:00:23.321186', '_unique_id': '9ea7983b55ce47a886c0e12d0b10f3ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.321 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.322 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.incoming.bytes volume: 6993 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06d91137-8a03-4156-a2c6-0f919ca83281', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.322804', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '59574faa-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': '489835f00bd44606a2ec9397b3bb121d071e20cea2da028627eb6107a2779d7b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6993, 'user_id': 
'3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.322804', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '59575a22-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '48b8ef8925b5c94ead70dadc76555cc433cafea4a1da9909ca9e9a0fdf7a332e'}]}, 'timestamp': '2026-01-22 00:00:23.323421', '_unique_id': '8d125a1902434c008f540c3c1d8aa893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.324 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.325 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.325 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e36e80a5-a641-4ce3-8f8e-6c4e98fec4ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.325150', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '5957a9c8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': 'b9ab400ed71bc546e38a6ae478cd905c2849722b42fe551576811fd96bd6bcac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.325150', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '5957b3d2-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': '82b66ab7449ed893b7298859ee2c9780a031bb01b33e7ececa0fdd799d1fd462'}]}, 'timestamp': '2026-01-22 00:00:23.325704', '_unique_id': 'd8a37a5b154b420b8e825d6eafed9eeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.338 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.338 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.348 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.usage volume: 30408704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.348 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8493e4e5-62e1-41dd-8497-7f6aea4fbfca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.327428', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5959a4f8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.046638885, 'message_signature': '21a6eeee9bab02dc2534ba5b12a177bed8fea1152f9a97e91c550c155fc6d6a7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 
'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.327428', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5959afe8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.046638885, 'message_signature': '622e8468f705b8732674ed7b91040a29759ee4ab683c507e68dd36a4e89584f0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30408704, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.327428', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595b3e1c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.057877934, 'message_signature': '11108bb0054fc9ec5c2b748f590980f2bc87e0815666781e8e8bfe52caca1829'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.327428', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595b49b6-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.057877934, 'message_signature': '3c18bf312f33cd2bd89faf05ae00c7b9b4bdacab4b99b08ecf1eb0c0ce064010'}]}, 'timestamp': '2026-01-22 00:00:23.349202', '_unique_id': 'a653abcf513043b892a8b78aee93e7d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.350 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.351 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.351 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.351 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.352 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.write.bytes volume: 438272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.352 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b35a1945-bde8-464a-9b20-03f8ff686fe0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.351661', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595bb50e-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': '6daa5cfb3fb9ec5c01fd7eaf428c7f1ece1eb643a2e91dfd82ce7e9fee0b1765'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 
'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.351661', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595bc18e-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.892503118, 'message_signature': 'a5f78d193d054b7bfecccb291e4c7a359dfa1f697a4b7131017aa2b1706c3d12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 438272, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.351661', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595bcb16-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': 'b7bf16a3d22116de7eac51ceb0b5848ad3325371ee30246449b26b4534c2362a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.351661', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595bd4c6-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.928235668, 'message_signature': '1f98af38fb95dc01a848a94c1e052560215612f9f62678df35777ef8bf53e6c4'}]}, 'timestamp': '2026-01-22 00:00:23.352750', '_unique_id': '6056a0de60c24d95995f111a5ae39e00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.354 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.354 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerTagsTestJSON-server-196894615>]
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.354 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.355 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.355 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.355 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc5c8f86-26ca-414f-b294-993c2a5fd56e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.354901', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595c33f8-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.046638885, 'message_signature': '55f8aa42f5462c8ec5182d8b60e20fc0e84a99f7969ff1c02b899b24d3122925'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 
'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.354901', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595c3d4e-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.046638885, 'message_signature': '9b7ed3ab043c88b0eb9649db9a8bb21c7e3ddd0d6824f17f3fe388f5590e54f4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.354901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595c4762-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.057877934, 'message_signature': '15066a0ef971edc7da9f8a8ca8f59844876da39071c53c023e63fbe8a56e7ecc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.354901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595c5090-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.057877934, 'message_signature': 'b1495218811d505033b7596f8fdea5445d89d44c5758bc3903c61ef7358c44eb'}]}, 'timestamp': '2026-01-22 00:00:23.355939', '_unique_id': '67a069eb40304e3b8cd1e3d11431744e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.357 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.357 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.357 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.358 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.358 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b10790dd-89dc-4abd-890e-a756c7afc78a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de-vda', 'timestamp': '2026-01-22T00:00:23.357573', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595c9b7c-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.046638885, 'message_signature': '05a42b9f204fd304baf15306086cebfc8f5ecb9072133cf5ed7155ee09263908'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 
'c5085649-028e-44f3-b7fa-53f19fc0a7de-sda', 'timestamp': '2026-01-22T00:00:23.357573', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'instance-00000051', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595ca590-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.046638885, 'message_signature': 'a0ff8f8d34ce36bf7a9961dfc579203291d31f4c46e383dcfd86b7b829515712'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-vda', 'timestamp': '2026-01-22T00:00:23.357573', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '595caff4-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.057877934, 'message_signature': 'd50f8ead30006e043f3c38be7d6e5c1c905510662b2ee836966d72adc6028ce5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf-sda', 'timestamp': '2026-01-22T00:00:23.357573', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '595cb918-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.057877934, 'message_signature': '5c67b3884764edda3d7a8fc065a8e7f0b965a0c6c5d7d11f605cf086a6b94929'}]}, 'timestamp': '2026-01-22 00:00:23.358586', '_unique_id': 'd34317c2be054604b3051a3862fc5c59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.359 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.360 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.360 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.360 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c5085649-028e-44f3-b7fa-53f19fc0a7de: ceilometer.compute.pollsters.NoVolumeException
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.360 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/memory.usage volume: 42.390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5543b35c-c568-43ac-995c-b20748f1a613', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.390625, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'timestamp': '2026-01-22T00:00:23.360381', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'instance-00000046', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '595d189a-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4582.020749947, 'message_signature': '74397f29416eb62f74470dccb85e2c5d411d760a16a68746ea70f48f8b23a2c6'}]}, 'timestamp': '2026-01-22 00:00:23.361079', '_unique_id': '743976fddebb43029159b2464bab26b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.361 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.362 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.362 12 DEBUG ceilometer.compute.pollsters [-] c5085649-028e-44f3-b7fa-53f19fc0a7de/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 DEBUG ceilometer.compute.pollsters [-] 9308be91-9a92-4389-939a-8b03d37474cf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad99cc67-1865-45a0-8782-dfaa0967b52f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9c8bb39ebd324063b1c1044104d8fe0d', 'user_name': None, 'project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'project_name': None, 'resource_id': 'instance-00000051-c5085649-028e-44f3-b7fa-53f19fc0a7de-tap9a8b67ec-5e', 'timestamp': '2026-01-22T00:00:23.362901', 'resource_metadata': {'display_name': 'tempest-ServerTagsTestJSON-server-196894615', 'name': 'tap9a8b67ec-5e', 'instance_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'instance_type': 'm1.nano', 'host': '24df005293addda2e6c1dd5df249203ca1b6ab99c54d56db81428361', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:f9:41', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a8b67ec-5e'}, 'message_id': '595d6c32-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.967390419, 'message_signature': '346234075a289f1923d056c50872280e8890cfd247517f9a32405dcaa4877472'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_name': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_name': None, 'resource_id': 'instance-00000046-9308be91-9a92-4389-939a-8b03d37474cf-tapd96fb6bb-97', 'timestamp': '2026-01-22T00:00:23.362901', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-396111842', 'name': 'tapd96fb6bb-97', 'instance_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'instance_type': 'm1.nano', 'host': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c3:44:d7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd96fb6bb-97'}, 'message_id': '595d7600-f725-11f0-b13b-fa163e425b77', 'monotonic_time': 4581.974649062, 'message_signature': 'c1fe7a5b156b480b9f4af07837140b9d24b8965944ff7eeb92819cd4bedfd9af'}]}, 'timestamp': '2026-01-22 00:00:23.363432', '_unique_id': 'cb8ba8f2087b465986180a615db2b3f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:00:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:00:23 np0005591285 nova_compute[182755]: 2026-01-22 00:00:23.389 182759 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:00:23 np0005591285 nova_compute[182755]: 2026-01-22 00:00:23.390 182759 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:00:23 np0005591285 nova_compute[182755]: 2026-01-22 00:00:23.390 182759 DEBUG nova.network.neutron [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:00:24 np0005591285 nova_compute[182755]: 2026-01-22 00:00:24.306 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:24 np0005591285 nova_compute[182755]: 2026-01-22 00:00:24.907 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.670 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.671 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.671 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.672 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.672 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.693 182759 DEBUG nova.network.neutron [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.699 182759 INFO nova.compute.manager [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Terminating instance#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.713 182759 DEBUG nova.compute.manager [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.719 182759 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:00:25 np0005591285 kernel: tap9a8b67ec-5e (unregistering): left promiscuous mode
Jan 21 19:00:25 np0005591285 NetworkManager[55017]: <info>  [1769040025.7359] device (tap9a8b67ec-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.741 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:25Z|00289|binding|INFO|Releasing lport 9a8b67ec-5e81-4335-80f2-76197d4f6f9e from this chassis (sb_readonly=0)
Jan 21 19:00:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:25Z|00290|binding|INFO|Setting lport 9a8b67ec-5e81-4335-80f2-76197d4f6f9e down in Southbound
Jan 21 19:00:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:25Z|00291|binding|INFO|Removing iface tap9a8b67ec-5e ovn-installed in OVS
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.754 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:25.760 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:f9:41 10.100.0.3'], port_security=['fa:16:3e:31:f9:41 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c5085649-028e-44f3-b7fa-53f19fc0a7de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af35880e-f8b9-4463-bac9-70c95c551a8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef52bd80396048e796f6c2dbf7295b47', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35be5298-c227-4fe0-9a03-50c7aabd6783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6ca68d07-06cf-4a9c-b5e8-6a0c54b84078, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=9a8b67ec-5e81-4335-80f2-76197d4f6f9e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.762 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:25.763 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 9a8b67ec-5e81-4335-80f2-76197d4f6f9e in datapath af35880e-f8b9-4463-bac9-70c95c551a8c unbound from our chassis#033[00m
Jan 21 19:00:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:25.766 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af35880e-f8b9-4463-bac9-70c95c551a8c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:00:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:25.768 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[996d3cd6-4f08-406e-b36c-b10c20c2e5ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:25.768 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c namespace which is not needed anymore#033[00m
Jan 21 19:00:25 np0005591285 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000051.scope: Deactivated successfully.
Jan 21 19:00:25 np0005591285 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000051.scope: Consumed 5.110s CPU time.
Jan 21 19:00:25 np0005591285 systemd-machined[154022]: Machine qemu-37-instance-00000051 terminated.
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.892 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:25 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [NOTICE]   (223176) : haproxy version is 2.8.14-c23fe91
Jan 21 19:00:25 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [NOTICE]   (223176) : path to executable is /usr/sbin/haproxy
Jan 21 19:00:25 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [WARNING]  (223176) : Exiting Master process...
Jan 21 19:00:25 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [WARNING]  (223176) : Exiting Master process...
Jan 21 19:00:25 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [ALERT]    (223176) : Current worker (223179) exited with code 143 (Terminated)
Jan 21 19:00:25 np0005591285 neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c[223166]: [WARNING]  (223176) : All workers exited. Exiting... (0)
Jan 21 19:00:25 np0005591285 systemd[1]: libpod-618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2.scope: Deactivated successfully.
Jan 21 19:00:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:25.909 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:25 np0005591285 podman[223211]: 2026-01-22 00:00:25.914727039 +0000 UTC m=+0.045611493 container died 618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.940 182759 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.941 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Creating file /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/e5ab3870ce704299836a88abd2af75bb.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.941 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/e5ab3870ce704299836a88abd2af75bb.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:25 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2-userdata-shm.mount: Deactivated successfully.
Jan 21 19:00:25 np0005591285 systemd[1]: var-lib-containers-storage-overlay-f3b87c384e1221e6166bbada1ca5643e88c705a780de132f530570d75a701510-merged.mount: Deactivated successfully.
Jan 21 19:00:25 np0005591285 podman[223211]: 2026-01-22 00:00:25.966784273 +0000 UTC m=+0.097668717 container cleanup 618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:00:25 np0005591285 systemd[1]: libpod-conmon-618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2.scope: Deactivated successfully.
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.983 182759 INFO nova.virt.libvirt.driver [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Instance destroyed successfully.#033[00m
Jan 21 19:00:25 np0005591285 nova_compute[182755]: 2026-01-22 00:00:25.985 182759 DEBUG nova.objects.instance [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lazy-loading 'resources' on Instance uuid c5085649-028e-44f3-b7fa-53f19fc0a7de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.001 182759 DEBUG nova.virt.libvirt.vif [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:00:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-196894615',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-196894615',id=81,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef52bd80396048e796f6c2dbf7295b47',ramdisk_id='',reservation_id='r-ja0ml4i1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerTagsTestJSON-1413291072',owner_user_name='tempest-ServerTagsTestJSON-1413291072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:21Z,user_data=None,user_id='9c8bb39ebd324063b1c1044104d8fe0d',uuid=c5085649-028e-44f3-b7fa-53f19fc0a7de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.001 182759 DEBUG nova.network.os_vif_util [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Converting VIF {"id": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "address": "fa:16:3e:31:f9:41", "network": {"id": "af35880e-f8b9-4463-bac9-70c95c551a8c", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-2067187433-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef52bd80396048e796f6c2dbf7295b47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a8b67ec-5e", "ovs_interfaceid": "9a8b67ec-5e81-4335-80f2-76197d4f6f9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.002 182759 DEBUG nova.network.os_vif_util [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.003 182759 DEBUG os_vif [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.006 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.006 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a8b67ec-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.012 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.015 182759 INFO os_vif [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f9:41,bridge_name='br-int',has_traffic_filtering=True,id=9a8b67ec-5e81-4335-80f2-76197d4f6f9e,network=Network(af35880e-f8b9-4463-bac9-70c95c551a8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a8b67ec-5e')#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.016 182759 INFO nova.virt.libvirt.driver [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Deleting instance files /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de_del#033[00m
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.017 182759 INFO nova.virt.libvirt.driver [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Deletion of /var/lib/nova/instances/c5085649-028e-44f3-b7fa-53f19fc0a7de_del complete#033[00m
Jan 21 19:00:26 np0005591285 podman[223257]: 2026-01-22 00:00:26.039880136 +0000 UTC m=+0.043350204 container remove 618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.047 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a15700ca-171a-4cc8-a084-c32637839fd7]: (4, ('Thu Jan 22 12:00:25 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c (618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2)\n618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2\nThu Jan 22 12:00:25 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c (618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2)\n618f58d591092028e306e624e6c853a6571d0b2cfc1af59112f862eaac58d8f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.048 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f21ab6b-b260-4ef0-939c-c05b47d79fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.049 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf35880e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:00:26 np0005591285 kernel: tapaf35880e-f0: left promiscuous mode
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.071 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[75d9077f-610b-48e7-985c-d590d0c2b168]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.083 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1072f495-274f-4c77-aa81-daef394fd6a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.084 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4b5a59-cce4-4566-99d1-6e2d255cc140]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.101 182759 INFO nova.compute.manager [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.102 182759 DEBUG oslo.service.loopingcall [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.102 182759 DEBUG nova.compute.manager [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.102 182759 DEBUG nova.network.neutron [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.105 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[301926e8-a9ad-4246-92df-c82957144b5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457899, 'reachable_time': 21715, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223272, 'error': None, 'target': 'ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.107 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-af35880e-f8b9-4463-bac9-70c95c551a8c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 19:00:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:26.107 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3c8e17-c604-4515-8987-16469fa5535f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:26 np0005591285 systemd[1]: run-netns-ovnmeta\x2daf35880e\x2df8b9\x2d4463\x2dbac9\x2d70c95c551a8c.mount: Deactivated successfully.
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.176 182759 DEBUG nova.compute.manager [req-09c5fe10-ec48-4d24-bc59-b1ef20068ab4 req-b8d22c92-4f67-4f1d-a55b-c5ad74cb7ecd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-vif-unplugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.176 182759 DEBUG oslo_concurrency.lockutils [req-09c5fe10-ec48-4d24-bc59-b1ef20068ab4 req-b8d22c92-4f67-4f1d-a55b-c5ad74cb7ecd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.176 182759 DEBUG oslo_concurrency.lockutils [req-09c5fe10-ec48-4d24-bc59-b1ef20068ab4 req-b8d22c92-4f67-4f1d-a55b-c5ad74cb7ecd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.177 182759 DEBUG oslo_concurrency.lockutils [req-09c5fe10-ec48-4d24-bc59-b1ef20068ab4 req-b8d22c92-4f67-4f1d-a55b-c5ad74cb7ecd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.177 182759 DEBUG nova.compute.manager [req-09c5fe10-ec48-4d24-bc59-b1ef20068ab4 req-b8d22c92-4f67-4f1d-a55b-c5ad74cb7ecd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] No waiting events found dispatching network-vif-unplugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.177 182759 DEBUG nova.compute.manager [req-09c5fe10-ec48-4d24-bc59-b1ef20068ab4 req-b8d22c92-4f67-4f1d-a55b-c5ad74cb7ecd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-vif-unplugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.425 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/e5ab3870ce704299836a88abd2af75bb.tmp" returned: 1 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.426 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/e5ab3870ce704299836a88abd2af75bb.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.427 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Creating directory /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.427 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.654 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:00:26 np0005591285 nova_compute[182755]: 2026-01-22 00:00:26.660 182759 DEBUG nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:00:27 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:27Z|00292|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.440 182759 DEBUG nova.network.neutron [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.465 182759 INFO nova.compute.manager [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Took 1.36 seconds to deallocate network for instance.
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.539 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.561 182759 DEBUG nova.compute.manager [req-9e22cbc4-90bc-4098-8c28-2f4b2910f7e3 req-2fe2b5dd-f640-41f8-aba2-240b823612a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-vif-deleted-9a8b67ec-5e81-4335-80f2-76197d4f6f9e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.587 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.587 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.681 182759 DEBUG nova.compute.provider_tree [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.699 182759 DEBUG nova.scheduler.client.report [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.725 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.761 182759 INFO nova.scheduler.client.report [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Deleted allocations for instance c5085649-028e-44f3-b7fa-53f19fc0a7de
Jan 21 19:00:27 np0005591285 nova_compute[182755]: 2026-01-22 00:00:27.869 182759 DEBUG oslo_concurrency.lockutils [None req-af1f27ae-cbd1-4497-bd1f-748af7479c9a 9c8bb39ebd324063b1c1044104d8fe0d ef52bd80396048e796f6c2dbf7295b47 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.201 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.324 182759 DEBUG nova.compute.manager [req-04824068-aa01-448a-835a-ddb4b98fdbb5 req-eb3bc9ba-c511-4ef0-b8be-8905fb7d8584 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.325 182759 DEBUG oslo_concurrency.lockutils [req-04824068-aa01-448a-835a-ddb4b98fdbb5 req-eb3bc9ba-c511-4ef0-b8be-8905fb7d8584 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.325 182759 DEBUG oslo_concurrency.lockutils [req-04824068-aa01-448a-835a-ddb4b98fdbb5 req-eb3bc9ba-c511-4ef0-b8be-8905fb7d8584 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.326 182759 DEBUG oslo_concurrency.lockutils [req-04824068-aa01-448a-835a-ddb4b98fdbb5 req-eb3bc9ba-c511-4ef0-b8be-8905fb7d8584 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c5085649-028e-44f3-b7fa-53f19fc0a7de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.326 182759 DEBUG nova.compute.manager [req-04824068-aa01-448a-835a-ddb4b98fdbb5 req-eb3bc9ba-c511-4ef0-b8be-8905fb7d8584 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] No waiting events found dispatching network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.327 182759 WARNING nova.compute.manager [req-04824068-aa01-448a-835a-ddb4b98fdbb5 req-eb3bc9ba-c511-4ef0-b8be-8905fb7d8584 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Received unexpected event network-vif-plugged-9a8b67ec-5e81-4335-80f2-76197d4f6f9e for instance with vm_state deleted and task_state None.
Jan 21 19:00:28 np0005591285 kernel: tapd96fb6bb-97 (unregistering): left promiscuous mode
Jan 21 19:00:28 np0005591285 NetworkManager[55017]: <info>  [1769040028.8886] device (tapd96fb6bb-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.894 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:28Z|00293|binding|INFO|Releasing lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 from this chassis (sb_readonly=0)
Jan 21 19:00:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:28Z|00294|binding|INFO|Setting lport d96fb6bb-9793-4373-8f62-3aa3f32af6a5 down in Southbound
Jan 21 19:00:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:00:28Z|00295|binding|INFO|Removing iface tapd96fb6bb-97 ovn-installed in OVS
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.897 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:28.908 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:44:d7 10.100.0.7'], port_security=['fa:16:3e:c3:44:d7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9308be91-9a92-4389-939a-8b03d37474cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '10', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d96fb6bb-9793-4373-8f62-3aa3f32af6a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:00:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:28.910 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 21 19:00:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:28.911 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:00:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:28.912 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d0cc26-27bd-4e36-a26c-62ccd5d6ce0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:00:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:28.913 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 21 19:00:28 np0005591285 nova_compute[182755]: 2026-01-22 00:00:28.926 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:00:28 np0005591285 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000046.scope: Deactivated successfully.
Jan 21 19:00:28 np0005591285 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000046.scope: Consumed 17.611s CPU time.
Jan 21 19:00:28 np0005591285 systemd-machined[154022]: Machine qemu-35-instance-00000046 terminated.
Jan 21 19:00:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [NOTICE]   (222232) : haproxy version is 2.8.14-c23fe91
Jan 21 19:00:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [NOTICE]   (222232) : path to executable is /usr/sbin/haproxy
Jan 21 19:00:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [WARNING]  (222232) : Exiting Master process...
Jan 21 19:00:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [WARNING]  (222232) : Exiting Master process...
Jan 21 19:00:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [ALERT]    (222232) : Current worker (222234) exited with code 143 (Terminated)
Jan 21 19:00:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[222228]: [WARNING]  (222232) : All workers exited. Exiting... (0)
Jan 21 19:00:29 np0005591285 systemd[1]: libpod-98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65.scope: Deactivated successfully.
Jan 21 19:00:29 np0005591285 podman[223298]: 2026-01-22 00:00:29.055735031 +0000 UTC m=+0.047836913 container died 98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:00:29 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65-userdata-shm.mount: Deactivated successfully.
Jan 21 19:00:29 np0005591285 systemd[1]: var-lib-containers-storage-overlay-863ef7382bfc8a761ac382d82512363832aa0939f83485f0509c621f675bb362-merged.mount: Deactivated successfully.
Jan 21 19:00:29 np0005591285 podman[223298]: 2026-01-22 00:00:29.086069627 +0000 UTC m=+0.078171509 container cleanup 98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:00:29 np0005591285 systemd[1]: libpod-conmon-98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65.scope: Deactivated successfully.
Jan 21 19:00:29 np0005591285 podman[223326]: 2026-01-22 00:00:29.154388833 +0000 UTC m=+0.046978879 container remove 98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.161 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4cdc233e-eed0-4499-ab4d-83fa8ae2718d]: (4, ('Thu Jan 22 12:00:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65)\n98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65\nThu Jan 22 12:00:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65)\n98c32dedac231254ee5faf47f58b2f2d3cc92489dcc4c0dfe425ed1d9eba5c65\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.164 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[34ba1bb4-a9ba-49f5-ac7c-d4dd2011f0ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.166 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:29 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.169 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.187 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.190 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a72f1c1-40a3-4307-b044-374542baad75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.205 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[649a5329-8bd3-4ee6-abff-3b594249a457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.206 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[79277056-78d2-4f9b-9613-c226fc2da17c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.222 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7906324a-35a4-446c-890e-c2c0f34c048a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449483, 'reachable_time': 18212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223359, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.224 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:00:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:00:29.225 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae4ff3e-2426-4600-8603-d2f78becc3c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:00:29 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.254 182759 DEBUG nova.compute.manager [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.254 182759 DEBUG oslo_concurrency.lockutils [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.254 182759 DEBUG oslo_concurrency.lockutils [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.255 182759 DEBUG oslo_concurrency.lockutils [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.255 182759 DEBUG nova.compute.manager [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.255 182759 WARNING nova.compute.manager [req-8d83ae3b-a114-4ca1-bf9b-5355703870e5 req-28e3188f-fc23-406c-9b89-383200f17a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-unplugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.680 182759 INFO nova.virt.libvirt.driver [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance shutdown successfully after 3 seconds.#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.688 182759 INFO nova.virt.libvirt.driver [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Instance destroyed successfully.#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.689 182759 DEBUG nova.virt.libvirt.vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.690 182759 DEBUG nova.network.os_vif_util [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c3:44:d7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.691 182759 DEBUG nova.network.os_vif_util [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.692 182759 DEBUG os_vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.695 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.695 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd96fb6bb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.697 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.699 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.703 182759 INFO os_vif [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.709 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.799 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.801 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.896 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.898 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Copying file /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk to 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 19:00:29 np0005591285 nova_compute[182755]: 2026-01-22 00:00:29.899 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.540 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "scp -r /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk" returned: 0 in 0.640s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.541 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Copying file /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.541 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk.config 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.792 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "scp -C -r /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk.config 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.config" returned: 0 in 0.250s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.794 182759 DEBUG nova.virt.libvirt.volume.remotefs [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Copying file /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.794 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk.info 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:00:30 np0005591285 nova_compute[182755]: 2026-01-22 00:00:30.894 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.413 182759 DEBUG nova.compute.manager [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.413 182759 DEBUG oslo_concurrency.lockutils [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.414 182759 DEBUG oslo_concurrency.lockutils [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.414 182759 DEBUG oslo_concurrency.lockutils [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.414 182759 DEBUG nova.compute.manager [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.414 182759 WARNING nova.compute.manager [req-e799b16d-f730-4081-ade3-4f43d6ffe3e9 req-cb665332-e332-49aa-96e7-b2a6c1da6055 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.712 182759 DEBUG oslo_concurrency.processutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "scp -C -r /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf_resize/disk.info 192.168.122.101:/var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk.info" returned: 0 in 0.918s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:00:31 np0005591285 nova_compute[182755]: 2026-01-22 00:00:31.921 182759 DEBUG neutronclient.v2_0.client [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.129 182759 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.130 182759 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.130 182759 DEBUG oslo_concurrency.lockutils [None req-7084740f-d576-4ef1-8009-26afe304ba5c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:32 np0005591285 podman[223373]: 2026-01-22 00:00:32.240895026 +0000 UTC m=+0.097042170 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:00:32 np0005591285 podman[223372]: 2026-01-22 00:00:32.256985714 +0000 UTC m=+0.114976047 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.341 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:32 np0005591285 nova_compute[182755]: 2026-01-22 00:00:32.491 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.330 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.331 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.332 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.332 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.411 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000046, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/9308be91-9a92-4389-939a-8b03d37474cf/disk#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.557 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.559 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5621MB free_disk=73.23980712890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.559 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.559 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.622 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Migration for instance 9308be91-9a92-4389-939a-8b03d37474cf refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.654 182759 INFO nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating resource usage from migration 26c72328-aea0-476d-a0df-60a56cb3907f#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.655 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Starting to track outgoing migration 26c72328-aea0-476d-a0df-60a56cb3907f with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.698 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.717 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Migration 26c72328-aea0-476d-a0df-60a56cb3907f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.718 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.718 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.795 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.824 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.865 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.866 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.925 182759 DEBUG nova.compute.manager [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.926 182759 DEBUG nova.compute.manager [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing instance network info cache due to event network-changed-d96fb6bb-9793-4373-8f62-3aa3f32af6a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.927 182759 DEBUG oslo_concurrency.lockutils [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.927 182759 DEBUG oslo_concurrency.lockutils [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:00:34 np0005591285 nova_compute[182755]: 2026-01-22 00:00:34.928 182759 DEBUG nova.network.neutron [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Refreshing network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:00:35 np0005591285 nova_compute[182755]: 2026-01-22 00:00:35.866 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:35 np0005591285 nova_compute[182755]: 2026-01-22 00:00:35.927 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:36 np0005591285 nova_compute[182755]: 2026-01-22 00:00:36.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:36 np0005591285 nova_compute[182755]: 2026-01-22 00:00:36.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:00:36 np0005591285 nova_compute[182755]: 2026-01-22 00:00:36.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:00:36 np0005591285 nova_compute[182755]: 2026-01-22 00:00:36.236 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:00:37 np0005591285 nova_compute[182755]: 2026-01-22 00:00:37.925 182759 DEBUG nova.network.neutron [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updated VIF entry in instance network info cache for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:00:37 np0005591285 nova_compute[182755]: 2026-01-22 00:00:37.925 182759 DEBUG nova.network.neutron [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:00:37 np0005591285 nova_compute[182755]: 2026-01-22 00:00:37.958 182759 DEBUG oslo_concurrency.lockutils [req-dda76e27-b347-4440-a550-5300598ea5a3 req-2589bb52-4aeb-4bc8-b973-f566fde31059 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:00:38 np0005591285 nova_compute[182755]: 2026-01-22 00:00:38.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:39 np0005591285 nova_compute[182755]: 2026-01-22 00:00:39.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:00:39 np0005591285 podman[223413]: 2026-01-22 00:00:39.233939086 +0000 UTC m=+0.093742452 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:00:39 np0005591285 nova_compute[182755]: 2026-01-22 00:00:39.701 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:40 np0005591285 nova_compute[182755]: 2026-01-22 00:00:40.930 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:40 np0005591285 nova_compute[182755]: 2026-01-22 00:00:40.982 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040025.9809787, c5085649-028e-44f3-b7fa-53f19fc0a7de => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:00:40 np0005591285 nova_compute[182755]: 2026-01-22 00:00:40.983 182759 INFO nova.compute.manager [-] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:00:41 np0005591285 nova_compute[182755]: 2026-01-22 00:00:41.162 182759 DEBUG nova.compute.manager [None req-3309351c-c16e-44cc-99f3-e2681627f175 - - - - - -] [instance: c5085649-028e-44f3-b7fa-53f19fc0a7de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:42 np0005591285 nova_compute[182755]: 2026-01-22 00:00:42.191 182759 DEBUG nova.compute.manager [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:42 np0005591285 nova_compute[182755]: 2026-01-22 00:00:42.192 182759 DEBUG oslo_concurrency.lockutils [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:42 np0005591285 nova_compute[182755]: 2026-01-22 00:00:42.193 182759 DEBUG oslo_concurrency.lockutils [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:42 np0005591285 nova_compute[182755]: 2026-01-22 00:00:42.193 182759 DEBUG oslo_concurrency.lockutils [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:42 np0005591285 nova_compute[182755]: 2026-01-22 00:00:42.193 182759 DEBUG nova.compute.manager [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:00:42 np0005591285 nova_compute[182755]: 2026-01-22 00:00:42.194 182759 WARNING nova.compute.manager [req-df114de0-3351-443b-bd0f-ad132f17bce5 req-de8503e4-5fd5-479f-98c1-f71f68cb6e06 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state resized and task_state None.#033[00m
Jan 21 19:00:42 np0005591285 podman[223436]: 2026-01-22 00:00:42.254298062 +0000 UTC m=+0.117869254 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:00:42 np0005591285 podman[223437]: 2026-01-22 00:00:42.275924266 +0000 UTC m=+0.137018433 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:00:42 np0005591285 podman[223438]: 2026-01-22 00:00:42.338756816 +0000 UTC m=+0.193119993 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.167 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040029.1655047, 9308be91-9a92-4389-939a-8b03d37474cf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.168 182759 INFO nova.compute.manager [-] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.207 182759 DEBUG nova.compute.manager [None req-346fc5e4-ee10-4c4d-8973-83e08d3c8936 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.214 182759 DEBUG nova.compute.manager [None req-346fc5e4-ee10-4c4d-8973-83e08d3c8936 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.283 182759 INFO nova.compute.manager [None req-346fc5e4-ee10-4c4d-8973-83e08d3c8936 - - - - - -] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.589 182759 DEBUG nova.compute.manager [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.590 182759 DEBUG oslo_concurrency.lockutils [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.590 182759 DEBUG oslo_concurrency.lockutils [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.591 182759 DEBUG oslo_concurrency.lockutils [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.591 182759 DEBUG nova.compute.manager [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] No waiting events found dispatching network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.592 182759 WARNING nova.compute.manager [req-98777641-7aa5-42cd-8fc8-a1d1b189dc6e req-043ed43a-bcf6-4566-8fa6-8fedb6a2476f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Received unexpected event network-vif-plugged-d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for instance with vm_state resized and task_state None.#033[00m
Jan 21 19:00:44 np0005591285 nova_compute[182755]: 2026-01-22 00:00:44.705 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:45 np0005591285 nova_compute[182755]: 2026-01-22 00:00:45.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:46 np0005591285 nova_compute[182755]: 2026-01-22 00:00:46.913 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "9308be91-9a92-4389-939a-8b03d37474cf" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:46 np0005591285 nova_compute[182755]: 2026-01-22 00:00:46.914 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:46 np0005591285 nova_compute[182755]: 2026-01-22 00:00:46.914 182759 DEBUG nova.compute.manager [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 21 19:00:46 np0005591285 nova_compute[182755]: 2026-01-22 00:00:46.975 182759 DEBUG nova.objects.instance [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'info_cache' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:00:48 np0005591285 nova_compute[182755]: 2026-01-22 00:00:48.483 182759 DEBUG neutronclient.v2_0.client [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port d96fb6bb-9793-4373-8f62-3aa3f32af6a5 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 21 19:00:48 np0005591285 nova_compute[182755]: 2026-01-22 00:00:48.483 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:00:48 np0005591285 nova_compute[182755]: 2026-01-22 00:00:48.484 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:00:48 np0005591285 nova_compute[182755]: 2026-01-22 00:00:48.484 182759 DEBUG nova.network.neutron [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:00:49 np0005591285 nova_compute[182755]: 2026-01-22 00:00:49.708 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.003 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.888 182759 DEBUG nova.network.neutron [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 9308be91-9a92-4389-939a-8b03d37474cf] Updating instance_info_cache with network_info: [{"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.908 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-9308be91-9a92-4389-939a-8b03d37474cf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.908 182759 DEBUG nova.objects.instance [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 9308be91-9a92-4389-939a-8b03d37474cf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.941 182759 DEBUG nova.virt.libvirt.vif [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-396111842',display_name='tempest-ServerActionsTestJSON-server-396111842',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-396111842',id=70,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-740ncwsh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=9308be91-9a92-4389-939a-8b03d37474cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.942 182759 DEBUG nova.network.os_vif_util [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "address": "fa:16:3e:c3:44:d7", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd96fb6bb-97", "ovs_interfaceid": "d96fb6bb-9793-4373-8f62-3aa3f32af6a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.944 182759 DEBUG nova.network.os_vif_util [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.945 182759 DEBUG os_vif [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.951 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd96fb6bb-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.951 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.962 182759 INFO os_vif [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:44:d7,bridge_name='br-int',has_traffic_filtering=True,id=d96fb6bb-9793-4373-8f62-3aa3f32af6a5,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd96fb6bb-97')#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.962 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:00:51 np0005591285 nova_compute[182755]: 2026-01-22 00:00:51.963 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:00:52 np0005591285 nova_compute[182755]: 2026-01-22 00:00:52.089 182759 DEBUG nova.compute.provider_tree [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:00:52 np0005591285 nova_compute[182755]: 2026-01-22 00:00:52.113 182759 DEBUG nova.scheduler.client.report [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:00:52 np0005591285 nova_compute[182755]: 2026-01-22 00:00:52.208 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:52 np0005591285 nova_compute[182755]: 2026-01-22 00:00:52.401 182759 INFO nova.scheduler.client.report [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Deleted allocation for migration 26c72328-aea0-476d-a0df-60a56cb3907f#033[00m
Jan 21 19:00:52 np0005591285 nova_compute[182755]: 2026-01-22 00:00:52.538 182759 DEBUG oslo_concurrency.lockutils [None req-4a1c0c34-b930-45ed-a237-e955fcf014e2 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "9308be91-9a92-4389-939a-8b03d37474cf" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:00:54 np0005591285 nova_compute[182755]: 2026-01-22 00:00:54.710 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:56 np0005591285 nova_compute[182755]: 2026-01-22 00:00:56.039 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:00:59 np0005591285 nova_compute[182755]: 2026-01-22 00:00:59.714 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:01 np0005591285 nova_compute[182755]: 2026-01-22 00:01:01.040 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:02.966 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:02.968 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:02.968 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:03 np0005591285 podman[223520]: 2026-01-22 00:01:03.247935298 +0000 UTC m=+0.105165187 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc.)
Jan 21 19:01:03 np0005591285 podman[223521]: 2026-01-22 00:01:03.268417482 +0000 UTC m=+0.104442187 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:01:04 np0005591285 nova_compute[182755]: 2026-01-22 00:01:04.716 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:06 np0005591285 nova_compute[182755]: 2026-01-22 00:01:06.042 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:09 np0005591285 nova_compute[182755]: 2026-01-22 00:01:09.720 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:10 np0005591285 podman[223560]: 2026-01-22 00:01:10.190248849 +0000 UTC m=+0.058463645 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:01:11 np0005591285 nova_compute[182755]: 2026-01-22 00:01:11.044 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:13 np0005591285 podman[223585]: 2026-01-22 00:01:13.184651016 +0000 UTC m=+0.056419930 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:01:13 np0005591285 podman[223586]: 2026-01-22 00:01:13.191677692 +0000 UTC m=+0.054436247 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:01:13 np0005591285 podman[223587]: 2026-01-22 00:01:13.217100918 +0000 UTC m=+0.079599776 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 19:01:14 np0005591285 nova_compute[182755]: 2026-01-22 00:01:14.722 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:16 np0005591285 nova_compute[182755]: 2026-01-22 00:01:16.045 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:19 np0005591285 nova_compute[182755]: 2026-01-22 00:01:19.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:21 np0005591285 nova_compute[182755]: 2026-01-22 00:01:21.046 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:23.456 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:01:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:23.457 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:01:23 np0005591285 nova_compute[182755]: 2026-01-22 00:01:23.491 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:24 np0005591285 nova_compute[182755]: 2026-01-22 00:01:24.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:26 np0005591285 nova_compute[182755]: 2026-01-22 00:01:26.048 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:27 np0005591285 nova_compute[182755]: 2026-01-22 00:01:27.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:27 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:27.459 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:29 np0005591285 nova_compute[182755]: 2026-01-22 00:01:29.730 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:29Z|00296|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.353 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.354 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.388 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.575 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.576 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.588 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.589 182759 INFO nova.compute.claims [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.838 182759 DEBUG nova.compute.provider_tree [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.861 182759 DEBUG nova.scheduler.client.report [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.888 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.889 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.973 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.974 182759 DEBUG nova.network.neutron [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:01:30 np0005591285 nova_compute[182755]: 2026-01-22 00:01:30.995 182759 INFO nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.040 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.049 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.303 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.304 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.304 182759 INFO nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Creating image(s)#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.305 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "/var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.305 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "/var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.306 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "/var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.319 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.381 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.383 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.385 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.414 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.490 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.491 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.549 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.551 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.552 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.648 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.652 182759 DEBUG nova.virt.disk.api [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Checking if we can resize image /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.652 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.715 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.718 182759 DEBUG nova.virt.disk.api [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Cannot resize image /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.719 182759 DEBUG nova.objects.instance [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'migration_context' on Instance uuid ec8b1222-c882-45b6-ac60-941f37ff8b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.730 182759 DEBUG nova.policy [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '531ec5a088a94b78af6e2c3feda17c0c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.749 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.749 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Ensure instance console log exists: /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.750 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.750 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:31 np0005591285 nova_compute[182755]: 2026-01-22 00:01:31.750 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:33 np0005591285 nova_compute[182755]: 2026-01-22 00:01:33.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:33 np0005591285 nova_compute[182755]: 2026-01-22 00:01:33.705 182759 DEBUG nova.network.neutron [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Successfully created port: 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:01:34 np0005591285 nova_compute[182755]: 2026-01-22 00:01:34.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:34 np0005591285 podman[223666]: 2026-01-22 00:01:34.24477125 +0000 UTC m=+0.113205950 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 21 19:01:34 np0005591285 podman[223665]: 2026-01-22 00:01:34.246343072 +0000 UTC m=+0.115618094 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:01:34 np0005591285 nova_compute[182755]: 2026-01-22 00:01:34.733 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.283 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.284 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.284 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.285 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.442 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.443 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5674MB free_disk=73.26876449584961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.444 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.444 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.566 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance ec8b1222-c882-45b6-ac60-941f37ff8b8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.566 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.567 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.630 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.654 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.729 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.729 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.804 182759 DEBUG nova.network.neutron [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Successfully updated port: 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.821 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "refresh_cache-ec8b1222-c882-45b6-ac60-941f37ff8b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.822 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquired lock "refresh_cache-ec8b1222-c882-45b6-ac60-941f37ff8b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:01:35 np0005591285 nova_compute[182755]: 2026-01-22 00:01:35.822 182759 DEBUG nova.network.neutron [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:01:36 np0005591285 nova_compute[182755]: 2026-01-22 00:01:36.051 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:36 np0005591285 nova_compute[182755]: 2026-01-22 00:01:36.207 182759 DEBUG nova.network.neutron [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:01:36 np0005591285 nova_compute[182755]: 2026-01-22 00:01:36.476 182759 DEBUG nova.compute.manager [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Received event network-changed-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:01:36 np0005591285 nova_compute[182755]: 2026-01-22 00:01:36.477 182759 DEBUG nova.compute.manager [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Refreshing instance network info cache due to event network-changed-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:01:36 np0005591285 nova_compute[182755]: 2026-01-22 00:01:36.477 182759 DEBUG oslo_concurrency.lockutils [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ec8b1222-c882-45b6-ac60-941f37ff8b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.464 182759 DEBUG nova.network.neutron [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Updating instance_info_cache with network_info: [{"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.494 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Releasing lock "refresh_cache-ec8b1222-c882-45b6-ac60-941f37ff8b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.494 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Instance network_info: |[{"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.494 182759 DEBUG oslo_concurrency.lockutils [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ec8b1222-c882-45b6-ac60-941f37ff8b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.495 182759 DEBUG nova.network.neutron [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Refreshing network info cache for port 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.497 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Start _get_guest_xml network_info=[{"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.502 182759 WARNING nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.507 182759 DEBUG nova.virt.libvirt.host [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.507 182759 DEBUG nova.virt.libvirt.host [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.514 182759 DEBUG nova.virt.libvirt.host [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.515 182759 DEBUG nova.virt.libvirt.host [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.516 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.516 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.517 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.517 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.517 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.517 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.517 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.518 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.518 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.518 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.518 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.518 182759 DEBUG nova.virt.hardware [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.522 182759 DEBUG nova.virt.libvirt.vif [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-577172231',display_name='tempest-ServersNegativeTestJSON-server-577172231',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-577172231',id=85,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-dcs018vy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:31Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=ec8b1222-c882-45b6-ac60-941f37ff8b8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.522 182759 DEBUG nova.network.os_vif_util [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.523 182759 DEBUG nova.network.os_vif_util [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.524 182759 DEBUG nova.objects.instance [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid ec8b1222-c882-45b6-ac60-941f37ff8b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.537 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <uuid>ec8b1222-c882-45b6-ac60-941f37ff8b8c</uuid>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <name>instance-00000055</name>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersNegativeTestJSON-server-577172231</nova:name>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:01:37</nova:creationTime>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:user uuid="531ec5a088a94b78af6e2c3feda17c0c">tempest-ServersNegativeTestJSON-1689661-project-member</nova:user>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:project uuid="a7e425a4d1854533a17d5f0dcd9d87b9">tempest-ServersNegativeTestJSON-1689661</nova:project>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        <nova:port uuid="853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <entry name="serial">ec8b1222-c882-45b6-ac60-941f37ff8b8c</entry>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <entry name="uuid">ec8b1222-c882-45b6-ac60-941f37ff8b8c</entry>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.config"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:12:25:5f"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <target dev="tap853e95b8-2a"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/console.log" append="off"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:01:37 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:01:37 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:01:37 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:01:37 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.538 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Preparing to wait for external event network-vif-plugged-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.538 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.539 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.539 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.539 182759 DEBUG nova.virt.libvirt.vif [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-577172231',display_name='tempest-ServersNegativeTestJSON-server-577172231',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-577172231',id=85,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-dcs018vy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:31Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=ec8b1222-c882-45b6-ac60-941f37ff8b8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.540 182759 DEBUG nova.network.os_vif_util [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.540 182759 DEBUG nova.network.os_vif_util [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.541 182759 DEBUG os_vif [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.541 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.542 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.547 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap853e95b8-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.547 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap853e95b8-2a, col_values=(('external_ids', {'iface-id': '853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:12:25:5f', 'vm-uuid': 'ec8b1222-c882-45b6-ac60-941f37ff8b8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:37 np0005591285 NetworkManager[55017]: <info>  [1769040097.5504] manager: (tap853e95b8-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.561 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.561 182759 INFO os_vif [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a')#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.615 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.616 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.617 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No VIF found with MAC fa:16:3e:12:25:5f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.617 182759 INFO nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Using config drive#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.725 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.760 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.761 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.761 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.788 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.789 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:01:37 np0005591285 nova_compute[182755]: 2026-01-22 00:01:37.790 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.159 182759 INFO nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Creating config drive at /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.config#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.169 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphtzm82hx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.319 182759 DEBUG oslo_concurrency.processutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphtzm82hx" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:01:38 np0005591285 kernel: tap853e95b8-2a: entered promiscuous mode
Jan 21 19:01:38 np0005591285 NetworkManager[55017]: <info>  [1769040098.4174] manager: (tap853e95b8-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Jan 21 19:01:38 np0005591285 systemd-udevd[223720]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:01:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:38Z|00297|binding|INFO|Claiming lport 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 for this chassis.
Jan 21 19:01:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:38Z|00298|binding|INFO|853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9: Claiming fa:16:3e:12:25:5f 10.100.0.11
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.471 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.480 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.490 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:25:5f 10.100.0.11'], port_security=['fa:16:3e:12:25:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ec8b1222-c882-45b6-ac60-941f37ff8b8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.491 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 bound to our chassis#033[00m
Jan 21 19:01:38 np0005591285 NetworkManager[55017]: <info>  [1769040098.4931] device (tap853e95b8-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:01:38 np0005591285 NetworkManager[55017]: <info>  [1769040098.4944] device (tap853e95b8-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.495 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 397ba44b-e27b-4a2a-a10b-7de0daa31656#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.512 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[42ca655a-9b9f-4fdb-94a5-e364b77bb0f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.514 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap397ba44b-e1 in ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.516 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap397ba44b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.516 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[26a78265-c342-4ded-80c7-7ec71c4b7418]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.517 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb7b663-b527-4a23-b233-e101114f7f42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 systemd-machined[154022]: New machine qemu-38-instance-00000055.
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.540 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[0f536095-533a-40da-b86e-c8b7406b59a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.559 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f73e599b-7b97-495a-bc6f-b88c99d8206b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:38Z|00299|binding|INFO|Setting lport 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 ovn-installed in OVS
Jan 21 19:01:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:38Z|00300|binding|INFO|Setting lport 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 up in Southbound
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.564 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:38 np0005591285 systemd[1]: Started Virtual Machine qemu-38-instance-00000055.
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.598 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[44916bfd-9588-4bef-aea1-3c8def7d871d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 NetworkManager[55017]: <info>  [1769040098.6087] manager: (tap397ba44b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/145)
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.608 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[954c173d-cf05-4460-8d49-09e1d369a4fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.650 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a70971ad-0737-439a-96af-ba00ed30cf39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.654 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[098a0813-c284-4ba2-849b-92da8bceea38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 NetworkManager[55017]: <info>  [1769040098.6825] device (tap397ba44b-e0): carrier: link connected
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.692 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[829dbd26-261f-46a0-88a2-d0bdf21bb7bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.712 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0457c3b9-6b77-4b0f-9a1b-9d3887174bee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465734, 'reachable_time': 39020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223756, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.738 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a658f54f-eccb-47bc-9ce3-f2fdb2b0318e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:12aa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 465734, 'tstamp': 465734}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223757, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.762 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3846c2-0866-4001-ac7e-0f10a98d2ce0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465734, 'reachable_time': 39020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223758, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.814 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e98835c2-a2ed-4921-889d-fcdb33cc87d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.817 182759 DEBUG nova.compute.manager [req-4b884460-e3e8-47e4-a60f-b687ad745490 req-58b05a75-1238-47ff-88b8-c591389e1921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Received event network-vif-plugged-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.818 182759 DEBUG oslo_concurrency.lockutils [req-4b884460-e3e8-47e4-a60f-b687ad745490 req-58b05a75-1238-47ff-88b8-c591389e1921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.818 182759 DEBUG oslo_concurrency.lockutils [req-4b884460-e3e8-47e4-a60f-b687ad745490 req-58b05a75-1238-47ff-88b8-c591389e1921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.819 182759 DEBUG oslo_concurrency.lockutils [req-4b884460-e3e8-47e4-a60f-b687ad745490 req-58b05a75-1238-47ff-88b8-c591389e1921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.819 182759 DEBUG nova.compute.manager [req-4b884460-e3e8-47e4-a60f-b687ad745490 req-58b05a75-1238-47ff-88b8-c591389e1921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Processing event network-vif-plugged-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.916 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.918 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040098.915641, ec8b1222-c882-45b6-ac60-941f37ff8b8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.918 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] VM Started (Lifecycle Event)#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.928 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.933 182759 INFO nova.virt.libvirt.driver [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Instance spawned successfully.#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.934 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.940 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9eae59d3-7d53-4289-8a4e-b644e015a87e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.942 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.943 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.944 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap397ba44b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:38 np0005591285 NetworkManager[55017]: <info>  [1769040098.9478] manager: (tap397ba44b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 21 19:01:38 np0005591285 kernel: tap397ba44b-e0: entered promiscuous mode
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.947 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.951 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap397ba44b-e0, col_values=(('external_ids', {'iface-id': 'f7f4d7e4-9841-41f2-85bd-658a3b613e0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:38Z|00301|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.963 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.969 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.970 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.971 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.971 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.972 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.973 182759 DEBUG nova.virt.libvirt.driver [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.977 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.978 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4c291f-2778-48ec-b0a8-6816d505f9fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.979 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:01:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:38.980 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'env', 'PROCESS_TAG=haproxy-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/397ba44b-e27b-4a2a-a10b-7de0daa31656.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.990 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.991 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040098.9160903, ec8b1222-c882-45b6-ac60-941f37ff8b8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:01:38 np0005591285 nova_compute[182755]: 2026-01-22 00:01:38.992 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.026 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.030 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040098.9270868, ec8b1222-c882-45b6-ac60-941f37ff8b8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.030 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.060 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.064 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.095 182759 INFO nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Took 7.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.096 182759 DEBUG nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.112 182759 DEBUG nova.network.neutron [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Updated VIF entry in instance network info cache for port 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.112 182759 DEBUG nova.network.neutron [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Updating instance_info_cache with network_info: [{"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.115 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.156 182759 DEBUG oslo_concurrency.lockutils [req-69596f5e-4da6-4cfa-9f0c-a88b22ff0250 req-6042699d-7555-49a5-90cd-12ccf5a9321b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ec8b1222-c882-45b6-ac60-941f37ff8b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.254 182759 INFO nova.compute.manager [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Took 8.74 seconds to build instance.#033[00m
Jan 21 19:01:39 np0005591285 nova_compute[182755]: 2026-01-22 00:01:39.275 182759 DEBUG oslo_concurrency.lockutils [None req-b65f8f80-e09b-4d89-ba79-67c6b7757efe 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:39 np0005591285 podman[223796]: 2026-01-22 00:01:39.468602929 +0000 UTC m=+0.083459299 container create c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:01:39 np0005591285 systemd[1]: Started libpod-conmon-c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0.scope.
Jan 21 19:01:39 np0005591285 podman[223796]: 2026-01-22 00:01:39.426727646 +0000 UTC m=+0.041584096 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:01:39 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:01:39 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c311f5a13265d7a38d210e64d0500658f1c89ced9186d41b202265d4c426828/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:01:39 np0005591285 podman[223796]: 2026-01-22 00:01:39.572498281 +0000 UTC m=+0.187354691 container init c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:01:39 np0005591285 podman[223796]: 2026-01-22 00:01:39.577504724 +0000 UTC m=+0.192361104 container start c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:01:39 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [NOTICE]   (223816) : New worker (223818) forked
Jan 21 19:01:39 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [NOTICE]   (223816) : Loading success.
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.953 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.953 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.954 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.954 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.954 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.958 182759 DEBUG nova.compute.manager [req-a8a3ccaf-1872-4542-b4f6-dfc388270222 req-d06c7361-cfef-4b29-9571-c17f3394026c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Received event network-vif-plugged-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.958 182759 DEBUG oslo_concurrency.lockutils [req-a8a3ccaf-1872-4542-b4f6-dfc388270222 req-d06c7361-cfef-4b29-9571-c17f3394026c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.958 182759 DEBUG oslo_concurrency.lockutils [req-a8a3ccaf-1872-4542-b4f6-dfc388270222 req-d06c7361-cfef-4b29-9571-c17f3394026c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.959 182759 DEBUG oslo_concurrency.lockutils [req-a8a3ccaf-1872-4542-b4f6-dfc388270222 req-d06c7361-cfef-4b29-9571-c17f3394026c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.959 182759 DEBUG nova.compute.manager [req-a8a3ccaf-1872-4542-b4f6-dfc388270222 req-d06c7361-cfef-4b29-9571-c17f3394026c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] No waiting events found dispatching network-vif-plugged-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.959 182759 WARNING nova.compute.manager [req-a8a3ccaf-1872-4542-b4f6-dfc388270222 req-d06c7361-cfef-4b29-9571-c17f3394026c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Received unexpected event network-vif-plugged-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.966 182759 INFO nova.compute.manager [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Terminating instance#033[00m
Jan 21 19:01:40 np0005591285 nova_compute[182755]: 2026-01-22 00:01:40.979 182759 DEBUG nova.compute.manager [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:01:41 np0005591285 kernel: tap853e95b8-2a (unregistering): left promiscuous mode
Jan 21 19:01:41 np0005591285 NetworkManager[55017]: <info>  [1769040101.0045] device (tap853e95b8-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:01:41 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:41Z|00302|binding|INFO|Releasing lport 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 from this chassis (sb_readonly=0)
Jan 21 19:01:41 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:41Z|00303|binding|INFO|Setting lport 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 down in Southbound
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.015 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 ovn_controller[94908]: 2026-01-22T00:01:41Z|00304|binding|INFO|Removing iface tap853e95b8-2a ovn-installed in OVS
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.018 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.030 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:25:5f 10.100.0.11'], port_security=['fa:16:3e:12:25:5f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ec8b1222-c882-45b6-ac60-941f37ff8b8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.031 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 unbound from our chassis#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.032 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 397ba44b-e27b-4a2a-a10b-7de0daa31656, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.033 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4c14b8d0-fefb-4707-ab52-18c3ce847810]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.034 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace which is not needed anymore#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.035 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000055.scope: Deactivated successfully.
Jan 21 19:01:41 np0005591285 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000055.scope: Consumed 2.442s CPU time.
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 systemd-machined[154022]: Machine qemu-38-instance-00000055 terminated.
Jan 21 19:01:41 np0005591285 podman[223827]: 2026-01-22 00:01:41.12826844 +0000 UTC m=+0.088246307 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:01:41 np0005591285 NetworkManager[55017]: <info>  [1769040101.2009] manager: (tap853e95b8-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Jan 21 19:01:41 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [NOTICE]   (223816) : haproxy version is 2.8.14-c23fe91
Jan 21 19:01:41 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [NOTICE]   (223816) : path to executable is /usr/sbin/haproxy
Jan 21 19:01:41 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [WARNING]  (223816) : Exiting Master process...
Jan 21 19:01:41 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [WARNING]  (223816) : Exiting Master process...
Jan 21 19:01:41 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [ALERT]    (223816) : Current worker (223818) exited with code 143 (Terminated)
Jan 21 19:01:41 np0005591285 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[223812]: [WARNING]  (223816) : All workers exited. Exiting... (0)
Jan 21 19:01:41 np0005591285 systemd[1]: libpod-c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0.scope: Deactivated successfully.
Jan 21 19:01:41 np0005591285 podman[223872]: 2026-01-22 00:01:41.22383672 +0000 UTC m=+0.063287323 container died c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:01:41 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0-userdata-shm.mount: Deactivated successfully.
Jan 21 19:01:41 np0005591285 systemd[1]: var-lib-containers-storage-overlay-2c311f5a13265d7a38d210e64d0500658f1c89ced9186d41b202265d4c426828-merged.mount: Deactivated successfully.
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.268 182759 INFO nova.virt.libvirt.driver [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Instance destroyed successfully.#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.269 182759 DEBUG nova.objects.instance [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'resources' on Instance uuid ec8b1222-c882-45b6-ac60-941f37ff8b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.285 182759 DEBUG nova.virt.libvirt.vif [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-577172231',display_name='tempest-ServersNegativeTestJSON-server-577172231',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-577172231',id=85,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-dcs018vy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:01:39Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=ec8b1222-c882-45b6-ac60-941f37ff8b8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.285 182759 DEBUG nova.network.os_vif_util [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "address": "fa:16:3e:12:25:5f", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap853e95b8-2a", "ovs_interfaceid": "853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.286 182759 DEBUG nova.network.os_vif_util [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.287 182759 DEBUG os_vif [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.289 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.289 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap853e95b8-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.291 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.292 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.297 182759 INFO os_vif [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:25:5f,bridge_name='br-int',has_traffic_filtering=True,id=853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap853e95b8-2a')#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.298 182759 INFO nova.virt.libvirt.driver [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Deleting instance files /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c_del#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.299 182759 INFO nova.virt.libvirt.driver [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Deletion of /var/lib/nova/instances/ec8b1222-c882-45b6-ac60-941f37ff8b8c_del complete#033[00m
Jan 21 19:01:41 np0005591285 podman[223872]: 2026-01-22 00:01:41.326716264 +0000 UTC m=+0.166166857 container cleanup c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:01:41 np0005591285 systemd[1]: libpod-conmon-c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0.scope: Deactivated successfully.
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.370 182759 INFO nova.compute.manager [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.371 182759 DEBUG oslo.service.loopingcall [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.371 182759 DEBUG nova.compute.manager [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.372 182759 DEBUG nova.network.neutron [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:01:41 np0005591285 podman[223922]: 2026-01-22 00:01:41.412973257 +0000 UTC m=+0.053575235 container remove c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.421 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbfecd6-f45f-4c83-88c2-8e39f3dbea15]: (4, ('Thu Jan 22 12:01:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0)\nc53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0\nThu Jan 22 12:01:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (c53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0)\nc53c65279f58610f462c1b0685f53cd06dd5bf92e0c35c35523b8ffa7d37b8a0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.423 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c21c6e28-3b99-4096-bc11-a87b441f223f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.425 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:01:41 np0005591285 kernel: tap397ba44b-e0: left promiscuous mode
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 nova_compute[182755]: 2026-01-22 00:01:41.452 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.455 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c676fbcc-8b5c-4d7b-afc7-4d3804e5c53b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.469 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7706b6-1096-4353-8aca-5362a4299cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.475 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[23a52011-cc81-4b8a-9f5a-8295194ea780]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.495 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d8ef86-982a-4cf3-b575-2d134d598202]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465725, 'reachable_time': 38658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223935, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.499 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:01:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:01:41.499 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6e9df455-d347-4b84-a3b2-a34de09dd287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:01:41 np0005591285 systemd[1]: run-netns-ovnmeta\x2d397ba44b\x2de27b\x2d4a2a\x2da10b\x2d7de0daa31656.mount: Deactivated successfully.
Jan 21 19:01:42 np0005591285 nova_compute[182755]: 2026-01-22 00:01:42.753 182759 DEBUG nova.network.neutron [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:01:42 np0005591285 nova_compute[182755]: 2026-01-22 00:01:42.829 182759 INFO nova.compute.manager [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Took 1.46 seconds to deallocate network for instance.#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.167 182759 DEBUG nova.compute.manager [req-8fd8cb0c-5ead-453f-a27a-27e18ab6647a req-a9131978-6756-4e95-b9ce-82c58536838d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Received event network-vif-deleted-853e95b8-2a2f-4ec2-afd2-18fe1ea6d1c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.223 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.224 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.297 182759 DEBUG nova.compute.provider_tree [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.326 182759 DEBUG nova.scheduler.client.report [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.494 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:43 np0005591285 nova_compute[182755]: 2026-01-22 00:01:43.571 182759 INFO nova.scheduler.client.report [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Deleted allocations for instance ec8b1222-c882-45b6-ac60-941f37ff8b8c#033[00m
Jan 21 19:01:44 np0005591285 nova_compute[182755]: 2026-01-22 00:01:44.061 182759 DEBUG oslo_concurrency.lockutils [None req-73884364-83b4-4c92-967e-f322317f056b 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "ec8b1222-c882-45b6-ac60-941f37ff8b8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:44 np0005591285 podman[223940]: 2026-01-22 00:01:44.211751461 +0000 UTC m=+0.077791128 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:01:44 np0005591285 podman[223941]: 2026-01-22 00:01:44.224669925 +0000 UTC m=+0.080002528 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:01:44 np0005591285 podman[223942]: 2026-01-22 00:01:44.256926682 +0000 UTC m=+0.109664476 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:01:46 np0005591285 nova_compute[182755]: 2026-01-22 00:01:46.054 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:46 np0005591285 nova_compute[182755]: 2026-01-22 00:01:46.292 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:51 np0005591285 nova_compute[182755]: 2026-01-22 00:01:51.056 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:51 np0005591285 nova_compute[182755]: 2026-01-22 00:01:51.294 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:55 np0005591285 nova_compute[182755]: 2026-01-22 00:01:55.107 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:55 np0005591285 nova_compute[182755]: 2026-01-22 00:01:55.109 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:55 np0005591285 nova_compute[182755]: 2026-01-22 00:01:55.332 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.059 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.266 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040101.2659707, ec8b1222-c882-45b6-ac60-941f37ff8b8c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.267 182759 INFO nova.compute.manager [-] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.297 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.374 182759 DEBUG nova.compute.manager [None req-28f053a4-f5f2-40fb-9423-3055cfc767f5 - - - - - -] [instance: ec8b1222-c882-45b6-ac60-941f37ff8b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.778 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.779 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.786 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:01:56 np0005591285 nova_compute[182755]: 2026-01-22 00:01:56.786 182759 INFO nova.compute.claims [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:01:57 np0005591285 nova_compute[182755]: 2026-01-22 00:01:57.298 182759 DEBUG nova.compute.provider_tree [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:01:57 np0005591285 nova_compute[182755]: 2026-01-22 00:01:57.344 182759 DEBUG nova.scheduler.client.report [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:01:57 np0005591285 nova_compute[182755]: 2026-01-22 00:01:57.494 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:01:57 np0005591285 nova_compute[182755]: 2026-01-22 00:01:57.495 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:01:57 np0005591285 nova_compute[182755]: 2026-01-22 00:01:57.915 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:01:57 np0005591285 nova_compute[182755]: 2026-01-22 00:01:57.916 182759 DEBUG nova.network.neutron [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:01:58 np0005591285 nova_compute[182755]: 2026-01-22 00:01:58.059 182759 INFO nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:01:58 np0005591285 nova_compute[182755]: 2026-01-22 00:01:58.496 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:01:59 np0005591285 nova_compute[182755]: 2026-01-22 00:01:59.191 182759 DEBUG nova.policy [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a54498c24248464db477c8bacbc2529f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '81061941b85c488d887a1cbb0d870471', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.054 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.056 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.056 182759 INFO nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Creating image(s)#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.057 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "/var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.058 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "/var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.059 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "/var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.083 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.145 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.146 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.147 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.159 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.218 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.219 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.252 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.253 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.253 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.318 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.319 182759 DEBUG nova.virt.disk.api [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Checking if we can resize image /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.319 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.382 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.383 182759 DEBUG nova.virt.disk.api [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Cannot resize image /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.384 182759 DEBUG nova.objects.instance [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lazy-loading 'migration_context' on Instance uuid cc06928c-a631-4fa0-9c21-daf6bef0991c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.454 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.455 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Ensure instance console log exists: /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.455 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.456 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:00 np0005591285 nova_compute[182755]: 2026-01-22 00:02:00.456 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:01 np0005591285 nova_compute[182755]: 2026-01-22 00:02:01.046 182759 DEBUG nova.network.neutron [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Successfully created port: 2d2a9041-fc22-4cf2-90fb-3db5c39c337d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:02:01 np0005591285 nova_compute[182755]: 2026-01-22 00:02:01.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:01 np0005591285 nova_compute[182755]: 2026-01-22 00:02:01.299 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:02.118 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:02:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:02.119 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:02:02 np0005591285 nova_compute[182755]: 2026-01-22 00:02:02.154 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:02.967 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:02.968 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:02.969 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:05 np0005591285 nova_compute[182755]: 2026-01-22 00:02:05.140 182759 DEBUG nova.network.neutron [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Successfully updated port: 2d2a9041-fc22-4cf2-90fb-3db5c39c337d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:02:05 np0005591285 podman[224019]: 2026-01-22 00:02:05.18579685 +0000 UTC m=+0.057913910 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter)
Jan 21 19:02:05 np0005591285 podman[224020]: 2026-01-22 00:02:05.21364811 +0000 UTC m=+0.070479734 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.099 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "refresh_cache-cc06928c-a631-4fa0-9c21-daf6bef0991c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.100 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquired lock "refresh_cache-cc06928c-a631-4fa0-9c21-daf6bef0991c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.100 182759 DEBUG nova.network.neutron [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.301 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.358 182759 DEBUG nova.compute.manager [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-changed-2d2a9041-fc22-4cf2-90fb-3db5c39c337d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.358 182759 DEBUG nova.compute.manager [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Refreshing instance network info cache due to event network-changed-2d2a9041-fc22-4cf2-90fb-3db5c39c337d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.359 182759 DEBUG oslo_concurrency.lockutils [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cc06928c-a631-4fa0-9c21-daf6bef0991c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:02:06 np0005591285 nova_compute[182755]: 2026-01-22 00:02:06.587 182759 DEBUG nova.network.neutron [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:02:07 np0005591285 nova_compute[182755]: 2026-01-22 00:02:07.681 182759 DEBUG nova.network.neutron [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Updating instance_info_cache with network_info: [{"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.196 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Releasing lock "refresh_cache-cc06928c-a631-4fa0-9c21-daf6bef0991c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.197 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Instance network_info: |[{"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.197 182759 DEBUG oslo_concurrency.lockutils [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cc06928c-a631-4fa0-9c21-daf6bef0991c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.198 182759 DEBUG nova.network.neutron [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Refreshing network info cache for port 2d2a9041-fc22-4cf2-90fb-3db5c39c337d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.200 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Start _get_guest_xml network_info=[{"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.208 182759 WARNING nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.219 182759 DEBUG nova.virt.libvirt.host [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.220 182759 DEBUG nova.virt.libvirt.host [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.225 182759 DEBUG nova.virt.libvirt.host [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.226 182759 DEBUG nova.virt.libvirt.host [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.227 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.228 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.228 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.228 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.228 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.229 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.229 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.229 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.229 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.230 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.230 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.230 182759 DEBUG nova.virt.hardware [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.234 182759 DEBUG nova.virt.libvirt.vif [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-748899397',display_name='tempest-InstanceActionsNegativeTestJSON-server-748899397',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-748899397',id=86,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='81061941b85c488d887a1cbb0d870471',ramdisk_id='',reservation_id='r-ls5bgkh9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1091295650',owner_u
ser_name='tempest-InstanceActionsNegativeTestJSON-1091295650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:58Z,user_data=None,user_id='a54498c24248464db477c8bacbc2529f',uuid=cc06928c-a631-4fa0-9c21-daf6bef0991c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.234 182759 DEBUG nova.network.os_vif_util [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Converting VIF {"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.235 182759 DEBUG nova.network.os_vif_util [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.235 182759 DEBUG nova.objects.instance [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lazy-loading 'pci_devices' on Instance uuid cc06928c-a631-4fa0-9c21-daf6bef0991c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.326 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <uuid>cc06928c-a631-4fa0-9c21-daf6bef0991c</uuid>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <name>instance-00000056</name>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:name>tempest-InstanceActionsNegativeTestJSON-server-748899397</nova:name>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:02:08</nova:creationTime>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:user uuid="a54498c24248464db477c8bacbc2529f">tempest-InstanceActionsNegativeTestJSON-1091295650-project-member</nova:user>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:project uuid="81061941b85c488d887a1cbb0d870471">tempest-InstanceActionsNegativeTestJSON-1091295650</nova:project>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        <nova:port uuid="2d2a9041-fc22-4cf2-90fb-3db5c39c337d">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <entry name="serial">cc06928c-a631-4fa0-9c21-daf6bef0991c</entry>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <entry name="uuid">cc06928c-a631-4fa0-9c21-daf6bef0991c</entry>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.config"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:7f:d4:48"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <target dev="tap2d2a9041-fc"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/console.log" append="off"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:02:08 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:02:08 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:02:08 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:02:08 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.328 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Preparing to wait for external event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.329 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.329 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.329 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.330 182759 DEBUG nova.virt.libvirt.vif [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:01:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-748899397',display_name='tempest-InstanceActionsNegativeTestJSON-server-748899397',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-748899397',id=86,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='81061941b85c488d887a1cbb0d870471',ramdisk_id='',reservation_id='r-ls5bgkh9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-109129565
0',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1091295650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:01:58Z,user_data=None,user_id='a54498c24248464db477c8bacbc2529f',uuid=cc06928c-a631-4fa0-9c21-daf6bef0991c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.330 182759 DEBUG nova.network.os_vif_util [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Converting VIF {"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.330 182759 DEBUG nova.network.os_vif_util [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.331 182759 DEBUG os_vif [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.332 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.332 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.335 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.335 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d2a9041-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.335 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d2a9041-fc, col_values=(('external_ids', {'iface-id': '2d2a9041-fc22-4cf2-90fb-3db5c39c337d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:d4:48', 'vm-uuid': 'cc06928c-a631-4fa0-9c21-daf6bef0991c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:08 np0005591285 NetworkManager[55017]: <info>  [1769040128.3381] manager: (tap2d2a9041-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.339 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.344 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.345 182759 INFO os_vif [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc')#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.690 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.690 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.691 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] No VIF found with MAC fa:16:3e:7f:d4:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:02:08 np0005591285 nova_compute[182755]: 2026-01-22 00:02:08.692 182759 INFO nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Using config drive#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.122 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:09 np0005591285 nova_compute[182755]: 2026-01-22 00:02:09.346 182759 INFO nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Creating config drive at /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.config#033[00m
Jan 21 19:02:09 np0005591285 nova_compute[182755]: 2026-01-22 00:02:09.351 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpca5swsgc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:02:09 np0005591285 nova_compute[182755]: 2026-01-22 00:02:09.480 182759 DEBUG oslo_concurrency.processutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpca5swsgc" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:02:09 np0005591285 kernel: tap2d2a9041-fc: entered promiscuous mode
Jan 21 19:02:09 np0005591285 NetworkManager[55017]: <info>  [1769040129.5397] manager: (tap2d2a9041-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Jan 21 19:02:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:09Z|00305|binding|INFO|Claiming lport 2d2a9041-fc22-4cf2-90fb-3db5c39c337d for this chassis.
Jan 21 19:02:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:09Z|00306|binding|INFO|2d2a9041-fc22-4cf2-90fb-3db5c39c337d: Claiming fa:16:3e:7f:d4:48 10.100.0.7
Jan 21 19:02:09 np0005591285 nova_compute[182755]: 2026-01-22 00:02:09.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:09 np0005591285 systemd-udevd[224079]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:02:09 np0005591285 systemd-machined[154022]: New machine qemu-39-instance-00000056.
Jan 21 19:02:09 np0005591285 NetworkManager[55017]: <info>  [1769040129.6167] device (tap2d2a9041-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:02:09 np0005591285 NetworkManager[55017]: <info>  [1769040129.6177] device (tap2d2a9041-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:02:09 np0005591285 systemd[1]: Started Virtual Machine qemu-39-instance-00000056.
Jan 21 19:02:09 np0005591285 nova_compute[182755]: 2026-01-22 00:02:09.639 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:09Z|00307|binding|INFO|Setting lport 2d2a9041-fc22-4cf2-90fb-3db5c39c337d ovn-installed in OVS
Jan 21 19:02:09 np0005591285 nova_compute[182755]: 2026-01-22 00:02:09.643 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:09Z|00308|binding|INFO|Setting lport 2d2a9041-fc22-4cf2-90fb-3db5c39c337d up in Southbound
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.949 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:d4:48 10.100.0.7'], port_security=['fa:16:3e:7f:d4:48 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'cc06928c-a631-4fa0-9c21-daf6bef0991c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b353e7-a528-49af-98b7-91bbacfa8db4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '81061941b85c488d887a1cbb0d870471', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0ff5b85-c00c-43c3-a8dc-b5f2a8e354c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0d3006-26b1-4abc-ba70-e1a18eca90a7, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=2d2a9041-fc22-4cf2-90fb-3db5c39c337d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.951 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 2d2a9041-fc22-4cf2-90fb-3db5c39c337d in datapath 50b353e7-a528-49af-98b7-91bbacfa8db4 bound to our chassis#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.952 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b353e7-a528-49af-98b7-91bbacfa8db4#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.964 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd588f3-0d33-4173-8ac4-9340ca813145]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.965 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap50b353e7-a1 in ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.967 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap50b353e7-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.967 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd59381-1e51-4f37-b972-dd588f584d9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.968 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[68ea6f49-2f11-41af-b905-0e533f84f293]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.986 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[22b54d22-4a34-44c2-969f-e5a917e4a390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:09.998 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59862c90-f644-4261-95e1-06214a36567a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.028 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[898be6d5-0c45-4487-a3a5-8ba47e4b80e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 systemd-udevd[224082]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.033 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a4a6ef-1707-4fc7-bf7a-3507f72f280d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 NetworkManager[55017]: <info>  [1769040130.0345] manager: (tap50b353e7-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.069 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6ebdb9-d098-445f-9cba-5b78689c4e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.073 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[93b9b378-b4f4-4976-a89f-02ab5c862ba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 NetworkManager[55017]: <info>  [1769040130.0991] device (tap50b353e7-a0): carrier: link connected
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.105 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c14e26af-7a46-493e-98a2-b6b090ca7a62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.121 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[90276456-65dc-43d4-868f-3acf22b046e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b353e7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:33:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468875, 'reachable_time': 27594, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224121, 'error': None, 'target': 'ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.138 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8f9d9d-edee-44b0-8c5f-632ef49b2627]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:33ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468875, 'tstamp': 468875}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224122, 'error': None, 'target': 'ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.145 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040130.144952, cc06928c-a631-4fa0-9c21-daf6bef0991c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.145 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] VM Started (Lifecycle Event)#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.158 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b577fc-4911-4fb7-93e4-3b8ac9508508]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b353e7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:33:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468875, 'reachable_time': 27594, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224123, 'error': None, 'target': 'ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.188 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1f952bba-62d0-4c5c-9131-2428224dec36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.246 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9285b6-2ad0-47bd-aa0f-76b2b337d614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.247 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b353e7-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.247 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.248 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b353e7-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:10 np0005591285 NetworkManager[55017]: <info>  [1769040130.2506] manager: (tap50b353e7-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 21 19:02:10 np0005591285 kernel: tap50b353e7-a0: entered promiscuous mode
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.253 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b353e7-a0, col_values=(('external_ids', {'iface-id': '42f3e4f8-c8bf-4a5e-af8c-cf5d7d8d1c3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.254 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:10Z|00309|binding|INFO|Releasing lport 42f3e4f8-c8bf-4a5e-af8c-cf5d7d8d1c3b from this chassis (sb_readonly=0)
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.257 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/50b353e7-a528-49af-98b7-91bbacfa8db4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/50b353e7-a528-49af-98b7-91bbacfa8db4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.258 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa6d262-a7ba-4edd-bf33-e59b9dcceb63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.259 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-50b353e7-a528-49af-98b7-91bbacfa8db4
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/50b353e7-a528-49af-98b7-91bbacfa8db4.pid.haproxy
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 50b353e7-a528-49af-98b7-91bbacfa8db4
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:02:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:10.259 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4', 'env', 'PROCESS_TAG=haproxy-50b353e7-a528-49af-98b7-91bbacfa8db4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/50b353e7-a528-49af-98b7-91bbacfa8db4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.265 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.403 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.408 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040130.1456728, cc06928c-a631-4fa0-9c21-daf6bef0991c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.409 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:02:10 np0005591285 podman[224153]: 2026-01-22 00:02:10.635148883 +0000 UTC m=+0.059192865 container create 8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.657 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:02:10 np0005591285 nova_compute[182755]: 2026-01-22 00:02:10.662 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:02:10 np0005591285 systemd[1]: Started libpod-conmon-8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e.scope.
Jan 21 19:02:10 np0005591285 podman[224153]: 2026-01-22 00:02:10.59783651 +0000 UTC m=+0.021880522 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:02:10 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:02:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7147e2f1ca5dd9a73e0189cdee121ec050af5747c18c6edf85771dd640d065b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:02:10 np0005591285 podman[224153]: 2026-01-22 00:02:10.735448368 +0000 UTC m=+0.159492380 container init 8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:02:10 np0005591285 podman[224153]: 2026-01-22 00:02:10.741127638 +0000 UTC m=+0.165171620 container start 8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:02:10 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [NOTICE]   (224172) : New worker (224174) forked
Jan 21 19:02:10 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [NOTICE]   (224172) : Loading success.
Jan 21 19:02:11 np0005591285 nova_compute[182755]: 2026-01-22 00:02:11.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:11 np0005591285 nova_compute[182755]: 2026-01-22 00:02:11.374 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:02:12 np0005591285 podman[224183]: 2026-01-22 00:02:12.25140569 +0000 UTC m=+0.114858234 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:02:12 np0005591285 nova_compute[182755]: 2026-01-22 00:02:12.690 182759 DEBUG nova.network.neutron [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Updated VIF entry in instance network info cache for port 2d2a9041-fc22-4cf2-90fb-3db5c39c337d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:02:12 np0005591285 nova_compute[182755]: 2026-01-22 00:02:12.691 182759 DEBUG nova.network.neutron [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Updating instance_info_cache with network_info: [{"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:02:12 np0005591285 nova_compute[182755]: 2026-01-22 00:02:12.888 182759 DEBUG oslo_concurrency.lockutils [req-28b36c31-2a96-474d-912e-275b653504e0 req-b280cfda-8e89-406a-961c-f222b26503e0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cc06928c-a631-4fa0-9c21-daf6bef0991c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.066 182759 DEBUG nova.compute.manager [req-0bf93a2d-49cb-4227-945f-d541f525e52b req-107c5bb0-1a72-43d3-af75-269b1211ed15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.067 182759 DEBUG oslo_concurrency.lockutils [req-0bf93a2d-49cb-4227-945f-d541f525e52b req-107c5bb0-1a72-43d3-af75-269b1211ed15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.067 182759 DEBUG oslo_concurrency.lockutils [req-0bf93a2d-49cb-4227-945f-d541f525e52b req-107c5bb0-1a72-43d3-af75-269b1211ed15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.069 182759 DEBUG oslo_concurrency.lockutils [req-0bf93a2d-49cb-4227-945f-d541f525e52b req-107c5bb0-1a72-43d3-af75-269b1211ed15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.069 182759 DEBUG nova.compute.manager [req-0bf93a2d-49cb-4227-945f-d541f525e52b req-107c5bb0-1a72-43d3-af75-269b1211ed15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Processing event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.071 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.077 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040133.0769064, cc06928c-a631-4fa0-9c21-daf6bef0991c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.078 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.080 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.084 182759 INFO nova.virt.libvirt.driver [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Instance spawned successfully.#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.084 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.338 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.934 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.936 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.936 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.937 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.937 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.937 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.938 182759 DEBUG nova.virt.libvirt.driver [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:02:13 np0005591285 nova_compute[182755]: 2026-01-22 00:02:13.942 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:02:14 np0005591285 nova_compute[182755]: 2026-01-22 00:02:14.233 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:02:14 np0005591285 nova_compute[182755]: 2026-01-22 00:02:14.808 182759 INFO nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Took 14.75 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:02:14 np0005591285 nova_compute[182755]: 2026-01-22 00:02:14.809 182759 DEBUG nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:02:15 np0005591285 podman[224206]: 2026-01-22 00:02:15.185861181 +0000 UTC m=+0.054692194 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:02:15 np0005591285 podman[224207]: 2026-01-22 00:02:15.191705176 +0000 UTC m=+0.057265713 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:02:15 np0005591285 podman[224208]: 2026-01-22 00:02:15.228858133 +0000 UTC m=+0.089699145 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.633 182759 DEBUG nova.compute.manager [req-9214ee5a-9477-4c6e-ade8-864309590c96 req-f9728bce-b9e6-4670-94af-dc7df0177758 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.633 182759 DEBUG oslo_concurrency.lockutils [req-9214ee5a-9477-4c6e-ade8-864309590c96 req-f9728bce-b9e6-4670-94af-dc7df0177758 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.634 182759 DEBUG oslo_concurrency.lockutils [req-9214ee5a-9477-4c6e-ade8-864309590c96 req-f9728bce-b9e6-4670-94af-dc7df0177758 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.634 182759 DEBUG oslo_concurrency.lockutils [req-9214ee5a-9477-4c6e-ade8-864309590c96 req-f9728bce-b9e6-4670-94af-dc7df0177758 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.635 182759 DEBUG nova.compute.manager [req-9214ee5a-9477-4c6e-ade8-864309590c96 req-f9728bce-b9e6-4670-94af-dc7df0177758 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] No waiting events found dispatching network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.635 182759 WARNING nova.compute.manager [req-9214ee5a-9477-4c6e-ade8-864309590c96 req-f9728bce-b9e6-4670-94af-dc7df0177758 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received unexpected event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:02:15 np0005591285 nova_compute[182755]: 2026-01-22 00:02:15.976 182759 INFO nova.compute.manager [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Took 19.90 seconds to build instance.#033[00m
Jan 21 19:02:16 np0005591285 nova_compute[182755]: 2026-01-22 00:02:16.068 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:16 np0005591285 nova_compute[182755]: 2026-01-22 00:02:16.516 182759 DEBUG oslo_concurrency.lockutils [None req-4dd73e41-2095-45dd-bdee-d324d25e1c4a a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:18 np0005591285 nova_compute[182755]: 2026-01-22 00:02:18.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:18 np0005591285 nova_compute[182755]: 2026-01-22 00:02:18.921 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:18 np0005591285 nova_compute[182755]: 2026-01-22 00:02:18.921 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:18 np0005591285 nova_compute[182755]: 2026-01-22 00:02:18.922 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:18 np0005591285 nova_compute[182755]: 2026-01-22 00:02:18.922 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:18 np0005591285 nova_compute[182755]: 2026-01-22 00:02:18.922 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.309 182759 INFO nova.compute.manager [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Terminating instance#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.480 182759 DEBUG nova.compute.manager [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:02:19 np0005591285 kernel: tap2d2a9041-fc (unregistering): left promiscuous mode
Jan 21 19:02:19 np0005591285 NetworkManager[55017]: <info>  [1769040139.5025] device (tap2d2a9041-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:02:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:19Z|00310|binding|INFO|Releasing lport 2d2a9041-fc22-4cf2-90fb-3db5c39c337d from this chassis (sb_readonly=0)
Jan 21 19:02:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:19Z|00311|binding|INFO|Setting lport 2d2a9041-fc22-4cf2-90fb-3db5c39c337d down in Southbound
Jan 21 19:02:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:02:19Z|00312|binding|INFO|Removing iface tap2d2a9041-fc ovn-installed in OVS
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.555 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000056.scope: Deactivated successfully.
Jan 21 19:02:19 np0005591285 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000056.scope: Consumed 6.910s CPU time.
Jan 21 19:02:19 np0005591285 systemd-machined[154022]: Machine qemu-39-instance-00000056 terminated.
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.640 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:d4:48 10.100.0.7'], port_security=['fa:16:3e:7f:d4:48 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'cc06928c-a631-4fa0-9c21-daf6bef0991c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b353e7-a528-49af-98b7-91bbacfa8db4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '81061941b85c488d887a1cbb0d870471', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0ff5b85-c00c-43c3-a8dc-b5f2a8e354c5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf0d3006-26b1-4abc-ba70-e1a18eca90a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=2d2a9041-fc22-4cf2-90fb-3db5c39c337d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.643 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 2d2a9041-fc22-4cf2-90fb-3db5c39c337d in datapath 50b353e7-a528-49af-98b7-91bbacfa8db4 unbound from our chassis#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.646 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50b353e7-a528-49af-98b7-91bbacfa8db4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.648 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ad82f64d-9d40-43ac-b9e0-cbd42243da6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.649 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4 namespace which is not needed anymore#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.745 182759 INFO nova.virt.libvirt.driver [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Instance destroyed successfully.#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.745 182759 DEBUG nova.objects.instance [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lazy-loading 'resources' on Instance uuid cc06928c-a631-4fa0-9c21-daf6bef0991c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:02:19 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [NOTICE]   (224172) : haproxy version is 2.8.14-c23fe91
Jan 21 19:02:19 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [NOTICE]   (224172) : path to executable is /usr/sbin/haproxy
Jan 21 19:02:19 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [WARNING]  (224172) : Exiting Master process...
Jan 21 19:02:19 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [ALERT]    (224172) : Current worker (224174) exited with code 143 (Terminated)
Jan 21 19:02:19 np0005591285 neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4[224167]: [WARNING]  (224172) : All workers exited. Exiting... (0)
Jan 21 19:02:19 np0005591285 systemd[1]: libpod-8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e.scope: Deactivated successfully.
Jan 21 19:02:19 np0005591285 podman[224310]: 2026-01-22 00:02:19.823519301 +0000 UTC m=+0.068457251 container died 8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.827 182759 DEBUG nova.virt.libvirt.vif [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-748899397',display_name='tempest-InstanceActionsNegativeTestJSON-server-748899397',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-748899397',id=86,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='81061941b85c488d887a1cbb0d870471',ramdisk_id='',reservation_id='r-ls5bgkh9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1091295650',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1091295650-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:15Z,user_data=None,user_id='a54498c24248464db477c8bacbc2529f',uuid=cc06928c-a631-4fa0-9c21-daf6bef0991c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.828 182759 DEBUG nova.network.os_vif_util [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Converting VIF {"id": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "address": "fa:16:3e:7f:d4:48", "network": {"id": "50b353e7-a528-49af-98b7-91bbacfa8db4", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-315463875-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "81061941b85c488d887a1cbb0d870471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d2a9041-fc", "ovs_interfaceid": "2d2a9041-fc22-4cf2-90fb-3db5c39c337d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.828 182759 DEBUG nova.network.os_vif_util [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.829 182759 DEBUG os_vif [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.830 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.831 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d2a9041-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.832 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.834 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.837 182759 INFO os_vif [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:d4:48,bridge_name='br-int',has_traffic_filtering=True,id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d,network=Network(50b353e7-a528-49af-98b7-91bbacfa8db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d2a9041-fc')#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.837 182759 INFO nova.virt.libvirt.driver [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Deleting instance files /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c_del#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.838 182759 INFO nova.virt.libvirt.driver [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Deletion of /var/lib/nova/instances/cc06928c-a631-4fa0-9c21-daf6bef0991c_del complete#033[00m
Jan 21 19:02:19 np0005591285 systemd[1]: var-lib-containers-storage-overlay-e7147e2f1ca5dd9a73e0189cdee121ec050af5747c18c6edf85771dd640d065b-merged.mount: Deactivated successfully.
Jan 21 19:02:19 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e-userdata-shm.mount: Deactivated successfully.
Jan 21 19:02:19 np0005591285 podman[224310]: 2026-01-22 00:02:19.860091172 +0000 UTC m=+0.105029132 container cleanup 8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 19:02:19 np0005591285 systemd[1]: libpod-conmon-8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e.scope: Deactivated successfully.
Jan 21 19:02:19 np0005591285 podman[224340]: 2026-01-22 00:02:19.936411791 +0000 UTC m=+0.046305892 container remove 8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.941 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[08eb36c7-846e-474b-9d02-dec6fa0d6494]: (4, ('Thu Jan 22 12:02:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4 (8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e)\n8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e\nThu Jan 22 12:02:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4 (8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e)\n8df9af214ba04d7f6748cb036291d1a16b54708a55178bb41c3214fb874a368e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.943 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a771038d-6836-47b8-af97-96cab58bbbba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.944 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b353e7-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:19 np0005591285 kernel: tap50b353e7-a0: left promiscuous mode
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 nova_compute[182755]: 2026-01-22 00:02:19.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.965 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[604711af-17ae-4a9a-853d-c3f4afd4ae3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.992 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d400db-7cea-4f9b-a1d3-5c6762f680b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:19.994 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b6a4d8-4ebc-4aa1-b73f-ccda6cc6c76d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:20.009 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[019da9fe-6e09-4516-94b7-9d55460dbaa3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468868, 'reachable_time': 28796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224356, 'error': None, 'target': 'ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:20.011 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-50b353e7-a528-49af-98b7-91bbacfa8db4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:02:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:20.012 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[659c2668-8c80-4345-87bc-69a04efac608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:02:20 np0005591285 systemd[1]: run-netns-ovnmeta\x2d50b353e7\x2da528\x2d49af\x2d98b7\x2d91bbacfa8db4.mount: Deactivated successfully.
Jan 21 19:02:20 np0005591285 nova_compute[182755]: 2026-01-22 00:02:20.663 182759 DEBUG nova.compute.manager [req-0e2afe37-db41-4c6c-9fc7-5c868d4f7af7 req-bd452e85-a79f-4a2f-827d-80b5e0a27832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-vif-unplugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:02:20 np0005591285 nova_compute[182755]: 2026-01-22 00:02:20.664 182759 DEBUG oslo_concurrency.lockutils [req-0e2afe37-db41-4c6c-9fc7-5c868d4f7af7 req-bd452e85-a79f-4a2f-827d-80b5e0a27832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:20 np0005591285 nova_compute[182755]: 2026-01-22 00:02:20.665 182759 DEBUG oslo_concurrency.lockutils [req-0e2afe37-db41-4c6c-9fc7-5c868d4f7af7 req-bd452e85-a79f-4a2f-827d-80b5e0a27832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:20 np0005591285 nova_compute[182755]: 2026-01-22 00:02:20.665 182759 DEBUG oslo_concurrency.lockutils [req-0e2afe37-db41-4c6c-9fc7-5c868d4f7af7 req-bd452e85-a79f-4a2f-827d-80b5e0a27832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:20 np0005591285 nova_compute[182755]: 2026-01-22 00:02:20.666 182759 DEBUG nova.compute.manager [req-0e2afe37-db41-4c6c-9fc7-5c868d4f7af7 req-bd452e85-a79f-4a2f-827d-80b5e0a27832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] No waiting events found dispatching network-vif-unplugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:02:20 np0005591285 nova_compute[182755]: 2026-01-22 00:02:20.666 182759 DEBUG nova.compute.manager [req-0e2afe37-db41-4c6c-9fc7-5c868d4f7af7 req-bd452e85-a79f-4a2f-827d-80b5e0a27832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-vif-unplugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:02:21 np0005591285 nova_compute[182755]: 2026-01-22 00:02:21.029 182759 INFO nova.compute.manager [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Took 1.55 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:02:21 np0005591285 nova_compute[182755]: 2026-01-22 00:02:21.031 182759 DEBUG oslo.service.loopingcall [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:02:21 np0005591285 nova_compute[182755]: 2026-01-22 00:02:21.032 182759 DEBUG nova.compute.manager [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:02:21 np0005591285 nova_compute[182755]: 2026-01-22 00:02:21.033 182759 DEBUG nova.network.neutron [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:02:21 np0005591285 nova_compute[182755]: 2026-01-22 00:02:21.070 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:22 np0005591285 nova_compute[182755]: 2026-01-22 00:02:22.697 182759 DEBUG nova.network.neutron [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.160 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.162 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.163 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:02:23.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.223 182759 DEBUG nova.compute.manager [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.223 182759 DEBUG oslo_concurrency.lockutils [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.223 182759 DEBUG oslo_concurrency.lockutils [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.223 182759 DEBUG oslo_concurrency.lockutils [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.224 182759 DEBUG nova.compute.manager [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] No waiting events found dispatching network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.224 182759 WARNING nova.compute.manager [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received unexpected event network-vif-plugged-2d2a9041-fc22-4cf2-90fb-3db5c39c337d for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.224 182759 DEBUG nova.compute.manager [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Received event network-vif-deleted-2d2a9041-fc22-4cf2-90fb-3db5c39c337d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.224 182759 INFO nova.compute.manager [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Neutron deleted interface 2d2a9041-fc22-4cf2-90fb-3db5c39c337d; detaching it from the instance and deleting it from the info cache#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.225 182759 DEBUG nova.network.neutron [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:02:23 np0005591285 nova_compute[182755]: 2026-01-22 00:02:23.752 182759 INFO nova.compute.manager [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Took 2.72 seconds to deallocate network for instance.#033[00m
Jan 21 19:02:24 np0005591285 nova_compute[182755]: 2026-01-22 00:02:24.044 182759 DEBUG nova.compute.manager [req-a3c6c8c0-24f0-4f5e-9088-4268935271c8 req-fbc56c5e-67fc-496b-9022-f5281612cb53 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Detach interface failed, port_id=2d2a9041-fc22-4cf2-90fb-3db5c39c337d, reason: Instance cc06928c-a631-4fa0-9c21-daf6bef0991c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 21 19:02:24 np0005591285 nova_compute[182755]: 2026-01-22 00:02:24.834 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:25 np0005591285 nova_compute[182755]: 2026-01-22 00:02:25.674 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:25 np0005591285 nova_compute[182755]: 2026-01-22 00:02:25.675 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:25 np0005591285 nova_compute[182755]: 2026-01-22 00:02:25.756 182759 DEBUG nova.compute.provider_tree [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:02:26 np0005591285 nova_compute[182755]: 2026-01-22 00:02:26.044 182759 DEBUG nova.scheduler.client.report [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:02:26 np0005591285 nova_compute[182755]: 2026-01-22 00:02:26.101 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:26 np0005591285 nova_compute[182755]: 2026-01-22 00:02:26.368 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:26 np0005591285 nova_compute[182755]: 2026-01-22 00:02:26.689 182759 INFO nova.scheduler.client.report [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Deleted allocations for instance cc06928c-a631-4fa0-9c21-daf6bef0991c#033[00m
Jan 21 19:02:27 np0005591285 nova_compute[182755]: 2026-01-22 00:02:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:27 np0005591285 nova_compute[182755]: 2026-01-22 00:02:27.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:02:27 np0005591285 nova_compute[182755]: 2026-01-22 00:02:27.821 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:02:27 np0005591285 nova_compute[182755]: 2026-01-22 00:02:27.882 182759 DEBUG oslo_concurrency.lockutils [None req-76561eee-1361-4b2e-a22b-93efd5ee3245 a54498c24248464db477c8bacbc2529f 81061941b85c488d887a1cbb0d870471 - - default default] Lock "cc06928c-a631-4fa0-9c21-daf6bef0991c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.960s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:28 np0005591285 nova_compute[182755]: 2026-01-22 00:02:28.816 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:29 np0005591285 nova_compute[182755]: 2026-01-22 00:02:29.861 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:31 np0005591285 nova_compute[182755]: 2026-01-22 00:02:31.103 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:33 np0005591285 nova_compute[182755]: 2026-01-22 00:02:33.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:33 np0005591285 nova_compute[182755]: 2026-01-22 00:02:33.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:33 np0005591285 nova_compute[182755]: 2026-01-22 00:02:33.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:02:34 np0005591285 nova_compute[182755]: 2026-01-22 00:02:34.744 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040139.7427607, cc06928c-a631-4fa0-9c21-daf6bef0991c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:02:34 np0005591285 nova_compute[182755]: 2026-01-22 00:02:34.745 182759 INFO nova.compute.manager [-] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:02:34 np0005591285 nova_compute[182755]: 2026-01-22 00:02:34.843 182759 DEBUG nova.compute.manager [None req-ad6e9a55-d1ad-48cb-bd0a-5a7b285d6fce - - - - - -] [instance: cc06928c-a631-4fa0-9c21-daf6bef0991c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:02:34 np0005591285 nova_compute[182755]: 2026-01-22 00:02:34.908 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:36 np0005591285 nova_compute[182755]: 2026-01-22 00:02:36.108 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:36 np0005591285 podman[224357]: 2026-01-22 00:02:36.193199802 +0000 UTC m=+0.062744028 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:02:36 np0005591285 podman[224358]: 2026-01-22 00:02:36.193190251 +0000 UTC m=+0.062345127 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:02:36 np0005591285 nova_compute[182755]: 2026-01-22 00:02:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:37 np0005591285 nova_compute[182755]: 2026-01-22 00:02:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:39 np0005591285 nova_compute[182755]: 2026-01-22 00:02:39.911 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:40.652 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:02:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:40.653 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:02:40 np0005591285 nova_compute[182755]: 2026-01-22 00:02:40.701 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:41 np0005591285 nova_compute[182755]: 2026-01-22 00:02:41.110 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.504 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.515 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.515 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.516 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.516 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:02:42 np0005591285 podman[224398]: 2026-01-22 00:02:42.602069057 +0000 UTC m=+0.053317658 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.669 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.670 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5657MB free_disk=73.2690200805664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.671 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:02:42 np0005591285 nova_compute[182755]: 2026-01-22 00:02:42.671 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:02:44 np0005591285 nova_compute[182755]: 2026-01-22 00:02:44.915 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.112 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:46 np0005591285 podman[224423]: 2026-01-22 00:02:46.206934016 +0000 UTC m=+0.071323687 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:02:46 np0005591285 podman[224424]: 2026-01-22 00:02:46.234492028 +0000 UTC m=+0.085108993 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.285 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.286 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:02:46 np0005591285 podman[224425]: 2026-01-22 00:02:46.289546662 +0000 UTC m=+0.141625476 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.375 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.446 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.848 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.848 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.849 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:46 np0005591285 nova_compute[182755]: 2026-01-22 00:02:46.849 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.234 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.235 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.235 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.236 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:48 np0005591285 nova_compute[182755]: 2026-01-22 00:02:48.236 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:02:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:02:48.655 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:02:49 np0005591285 nova_compute[182755]: 2026-01-22 00:02:49.918 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:51 np0005591285 nova_compute[182755]: 2026-01-22 00:02:51.115 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:54 np0005591285 nova_compute[182755]: 2026-01-22 00:02:54.921 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:56 np0005591285 nova_compute[182755]: 2026-01-22 00:02:56.117 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:02:59 np0005591285 nova_compute[182755]: 2026-01-22 00:02:59.924 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:01 np0005591285 nova_compute[182755]: 2026-01-22 00:03:01.119 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:02.968 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:02.969 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:02.969 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:04 np0005591285 nova_compute[182755]: 2026-01-22 00:03:04.173 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:04 np0005591285 nova_compute[182755]: 2026-01-22 00:03:04.173 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:04 np0005591285 nova_compute[182755]: 2026-01-22 00:03:04.316 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:03:04 np0005591285 nova_compute[182755]: 2026-01-22 00:03:04.927 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.103 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.103 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.113 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.113 182759 INFO nova.compute.claims [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.797 182759 DEBUG nova.compute.provider_tree [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.856 182759 DEBUG nova.scheduler.client.report [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.946 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:05 np0005591285 nova_compute[182755]: 2026-01-22 00:03:05.947 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:03:06 np0005591285 nova_compute[182755]: 2026-01-22 00:03:06.120 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:06 np0005591285 nova_compute[182755]: 2026-01-22 00:03:06.177 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:03:06 np0005591285 nova_compute[182755]: 2026-01-22 00:03:06.178 182759 DEBUG nova.network.neutron [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:03:06 np0005591285 nova_compute[182755]: 2026-01-22 00:03:06.242 182759 INFO nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:03:06 np0005591285 nova_compute[182755]: 2026-01-22 00:03:06.297 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:03:06 np0005591285 nova_compute[182755]: 2026-01-22 00:03:06.487 182759 DEBUG nova.policy [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:03:07 np0005591285 podman[224493]: 2026-01-22 00:03:07.218244313 +0000 UTC m=+0.079940906 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, maintainer=Red Hat, Inc.)
Jan 21 19:03:07 np0005591285 podman[224494]: 2026-01-22 00:03:07.219163537 +0000 UTC m=+0.084240959 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:03:07 np0005591285 nova_compute[182755]: 2026-01-22 00:03:07.995 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:03:07 np0005591285 nova_compute[182755]: 2026-01-22 00:03:07.997 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:03:07 np0005591285 nova_compute[182755]: 2026-01-22 00:03:07.998 182759 INFO nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Creating image(s)#033[00m
Jan 21 19:03:07 np0005591285 nova_compute[182755]: 2026-01-22 00:03:07.999 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.000 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.001 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.025 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.123 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.124 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.125 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.140 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.208 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.209 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.244 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.246 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.246 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.306 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.307 182759 DEBUG nova.virt.disk.api [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.308 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.365 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.367 182759 DEBUG nova.virt.disk.api [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.367 182759 DEBUG nova.objects.instance [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.460 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.461 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Ensure instance console log exists: /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.461 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.461 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:08 np0005591285 nova_compute[182755]: 2026-01-22 00:03:08.462 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:09 np0005591285 nova_compute[182755]: 2026-01-22 00:03:09.281 182759 DEBUG nova.network.neutron [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Successfully created port: d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:03:09 np0005591285 nova_compute[182755]: 2026-01-22 00:03:09.930 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:11 np0005591285 nova_compute[182755]: 2026-01-22 00:03:11.123 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.108 182759 DEBUG nova.network.neutron [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Successfully updated port: d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.212 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.213 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.213 182759 DEBUG nova.network.neutron [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.328 182759 DEBUG nova.compute.manager [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-changed-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.328 182759 DEBUG nova.compute.manager [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Refreshing instance network info cache due to event network-changed-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.329 182759 DEBUG oslo_concurrency.lockutils [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:03:12 np0005591285 nova_compute[182755]: 2026-01-22 00:03:12.704 182759 DEBUG nova.network.neutron [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:03:13 np0005591285 podman[224547]: 2026-01-22 00:03:13.221508818 +0000 UTC m=+0.075107708 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:03:14 np0005591285 nova_compute[182755]: 2026-01-22 00:03:14.932 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.000 182759 DEBUG nova.network.neutron [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.160 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.160 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance network_info: |[{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.161 182759 DEBUG oslo_concurrency.lockutils [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.161 182759 DEBUG nova.network.neutron [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Refreshing network info cache for port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.163 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Start _get_guest_xml network_info=[{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.169 182759 WARNING nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.173 182759 DEBUG nova.virt.libvirt.host [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.174 182759 DEBUG nova.virt.libvirt.host [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.178 182759 DEBUG nova.virt.libvirt.host [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.179 182759 DEBUG nova.virt.libvirt.host [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.180 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.180 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.181 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.181 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.181 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.181 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.181 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.182 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.182 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.182 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.183 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.183 182759 DEBUG nova.virt.hardware [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.187 182759 DEBUG nova.virt.libvirt.vif [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.187 182759 DEBUG nova.network.os_vif_util [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.188 182759 DEBUG nova.network.os_vif_util [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.189 182759 DEBUG nova.objects.instance [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.313 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <uuid>65bbb3bd-2b3c-4868-bf10-ce8795c0a312</uuid>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <name>instance-00000058</name>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestJSON-server-1767669163</nova:name>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:03:15</nova:creationTime>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        <nova:port uuid="d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <entry name="serial">65bbb3bd-2b3c-4868-bf10-ce8795c0a312</entry>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <entry name="uuid">65bbb3bd-2b3c-4868-bf10-ce8795c0a312</entry>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:d5:43:90"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <target dev="tapd13b0c1b-9c"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/console.log" append="off"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:03:15 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:03:15 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:03:15 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:03:15 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.315 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Preparing to wait for external event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.315 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.316 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.316 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.317 182759 DEBUG nova.virt.libvirt.vif [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:03:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.318 182759 DEBUG nova.network.os_vif_util [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.319 182759 DEBUG nova.network.os_vif_util [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.320 182759 DEBUG os_vif [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.322 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.322 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.326 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.326 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd13b0c1b-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.327 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd13b0c1b-9c, col_values=(('external_ids', {'iface-id': 'd13b0c1b-9c16-4db4-bc03-d7ffef3f3af0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:43:90', 'vm-uuid': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.329 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:15 np0005591285 NetworkManager[55017]: <info>  [1769040195.3309] manager: (tapd13b0c1b-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.332 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.335 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.336 182759 INFO os_vif [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c')#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.543 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.543 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.544 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:d5:43:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:03:15 np0005591285 nova_compute[182755]: 2026-01-22 00:03:15.544 182759 INFO nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Using config drive#033[00m
Jan 21 19:03:16 np0005591285 nova_compute[182755]: 2026-01-22 00:03:16.124 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:17 np0005591285 podman[224575]: 2026-01-22 00:03:17.189795946 +0000 UTC m=+0.053634806 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:03:17 np0005591285 podman[224574]: 2026-01-22 00:03:17.190789643 +0000 UTC m=+0.059041850 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:03:17 np0005591285 podman[224576]: 2026-01-22 00:03:17.251235269 +0000 UTC m=+0.111592167 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.538 182759 INFO nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Creating config drive at /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config#033[00m
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.546 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw2u812ej execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.692 182759 DEBUG oslo_concurrency.processutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw2u812ej" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:17 np0005591285 kernel: tapd13b0c1b-9c: entered promiscuous mode
Jan 21 19:03:17 np0005591285 NetworkManager[55017]: <info>  [1769040197.7783] manager: (tapd13b0c1b-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/153)
Jan 21 19:03:17 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:17Z|00313|binding|INFO|Claiming lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for this chassis.
Jan 21 19:03:17 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:17Z|00314|binding|INFO|d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0: Claiming fa:16:3e:d5:43:90 10.100.0.6
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.781 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.787 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.791 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:17 np0005591285 systemd-udevd[224651]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:03:17 np0005591285 NetworkManager[55017]: <info>  [1769040197.8310] device (tapd13b0c1b-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:03:17 np0005591285 NetworkManager[55017]: <info>  [1769040197.8320] device (tapd13b0c1b-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:03:17 np0005591285 systemd-machined[154022]: New machine qemu-40-instance-00000058.
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.877 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:17 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:17Z|00315|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 ovn-installed in OVS
Jan 21 19:03:17 np0005591285 nova_compute[182755]: 2026-01-22 00:03:17.881 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:17 np0005591285 systemd[1]: Started Virtual Machine qemu-40-instance-00000058.
Jan 21 19:03:17 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:17Z|00316|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 up in Southbound
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.947 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:43:90 10.100.0.6'], port_security=['fa:16:3e:d5:43:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.948 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.950 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.967 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0d493899-acdb-4633-b94d-1c09d646be44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.968 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.971 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.971 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[040f4b89-1b73-478a-9aed-5c2ff083694e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.973 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[55bc68c9-726c-4870-ab8e-e2ebb35b7409]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:17 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:17.994 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[73da2260-8de2-4df7-b32f-0ac7b79d858c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.012 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3836b9dd-90dd-4bc1-8cb1-3bdacb3d2518]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.047 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9d961695-862b-4d63-a520-4595e5a52038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.052 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fef9436c-4ae9-4819-90ce-4e0cd20f331b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 NetworkManager[55017]: <info>  [1769040198.0542] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/154)
Jan 21 19:03:18 np0005591285 systemd-udevd[224655]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.102 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8ae3e4-184c-45e3-9a85-955efb6bfc8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.107 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9974721c-9bf9-4b11-8baf-e78de1825f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 NetworkManager[55017]: <info>  [1769040198.1443] device (tap19c3e0c8-50): carrier: link connected
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.152 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f94209-f295-4783-a36d-75f944037076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.178 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[12346da0-11f0-47ab-8b35-4f9ab32627a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475680, 'reachable_time': 26983, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224687, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.199 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[887cbe9e-1a7a-45ec-80b2-8d14de84cd57]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475680, 'tstamp': 475680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224688, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.233 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[620908f3-c0df-495f-831c-396c35ed01a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475680, 'reachable_time': 26983, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224689, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.273 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[924f8ffb-b162-401d-acfb-2c86e98033ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.370 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1f43ab-89f2-4336-be55-1d72748375b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.372 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.373 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.374 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:18 np0005591285 nova_compute[182755]: 2026-01-22 00:03:18.443 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:18 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 19:03:18 np0005591285 NetworkManager[55017]: <info>  [1769040198.4438] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.446 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:18Z|00317|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 19:03:18 np0005591285 nova_compute[182755]: 2026-01-22 00:03:18.462 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:18 np0005591285 nova_compute[182755]: 2026-01-22 00:03:18.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.465 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.465 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[36adde33-fb5f-4269-88b2-035e37462d29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.466 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:03:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:18.467 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:03:18 np0005591285 podman[224721]: 2026-01-22 00:03:18.881430846 +0000 UTC m=+0.048448428 container create b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:03:18 np0005591285 systemd[1]: Started libpod-conmon-b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8.scope.
Jan 21 19:03:18 np0005591285 podman[224721]: 2026-01-22 00:03:18.855185599 +0000 UTC m=+0.022203211 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:03:18 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:03:18 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5951641f78b3a1125f90cc09267cbb365a746d5e99652f56ede65ea07edc0ed1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:03:18 np0005591285 podman[224721]: 2026-01-22 00:03:18.985198774 +0000 UTC m=+0.152216386 container init b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:03:18 np0005591285 nova_compute[182755]: 2026-01-22 00:03:18.984 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040198.9829507, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:03:18 np0005591285 nova_compute[182755]: 2026-01-22 00:03:18.985 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Started (Lifecycle Event)#033[00m
Jan 21 19:03:18 np0005591285 podman[224721]: 2026-01-22 00:03:18.991223294 +0000 UTC m=+0.158240876 container start b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 19:03:19 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [NOTICE]   (224748) : New worker (224750) forked
Jan 21 19:03:19 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [NOTICE]   (224748) : Loading success.
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.263 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.269 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040198.9851153, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.269 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.276 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:19.278 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:03:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:19.279 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.441 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.446 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.537 182759 DEBUG nova.network.neutron [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updated VIF entry in instance network info cache for port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.538 182759 DEBUG nova.network.neutron [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.796 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:03:19 np0005591285 nova_compute[182755]: 2026-01-22 00:03:19.803 182759 DEBUG oslo_concurrency.lockutils [req-805179f3-e8b2-4590-ac1f-af6f5cc3366e req-1ae812dd-ee7e-4194-b83e-ed7b3812d0c6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.587 182759 DEBUG nova.compute.manager [req-19545a80-ca7e-4854-8b53-d4db54431fa8 req-1506e8ed-248a-4b14-a1d7-dd833db0ffbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.587 182759 DEBUG oslo_concurrency.lockutils [req-19545a80-ca7e-4854-8b53-d4db54431fa8 req-1506e8ed-248a-4b14-a1d7-dd833db0ffbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.588 182759 DEBUG oslo_concurrency.lockutils [req-19545a80-ca7e-4854-8b53-d4db54431fa8 req-1506e8ed-248a-4b14-a1d7-dd833db0ffbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.588 182759 DEBUG oslo_concurrency.lockutils [req-19545a80-ca7e-4854-8b53-d4db54431fa8 req-1506e8ed-248a-4b14-a1d7-dd833db0ffbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.588 182759 DEBUG nova.compute.manager [req-19545a80-ca7e-4854-8b53-d4db54431fa8 req-1506e8ed-248a-4b14-a1d7-dd833db0ffbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Processing event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.589 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.594 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040200.5943518, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.595 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.598 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.603 182759 INFO nova.virt.libvirt.driver [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance spawned successfully.#033[00m
Jan 21 19:03:20 np0005591285 nova_compute[182755]: 2026-01-22 00:03:20.603 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:03:21 np0005591285 nova_compute[182755]: 2026-01-22 00:03:21.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.384 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.388 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.398 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.399 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.399 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.399 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.400 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:03:23 np0005591285 nova_compute[182755]: 2026-01-22 00:03:23.400 182759 DEBUG nova.virt.libvirt.driver [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:03:24 np0005591285 nova_compute[182755]: 2026-01-22 00:03:24.147 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:03:25 np0005591285 nova_compute[182755]: 2026-01-22 00:03:25.335 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:26 np0005591285 nova_compute[182755]: 2026-01-22 00:03:26.129 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:27 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:27.283 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:28 np0005591285 nova_compute[182755]: 2026-01-22 00:03:28.242 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:30 np0005591285 nova_compute[182755]: 2026-01-22 00:03:30.338 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.131 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.222 182759 DEBUG nova.compute.manager [req-7a7156f6-3fac-4c36-a4f1-c1856cf7e3b7 req-69bed602-8cf4-4e6f-8eae-8a470e264a15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.223 182759 DEBUG oslo_concurrency.lockutils [req-7a7156f6-3fac-4c36-a4f1-c1856cf7e3b7 req-69bed602-8cf4-4e6f-8eae-8a470e264a15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.223 182759 DEBUG oslo_concurrency.lockutils [req-7a7156f6-3fac-4c36-a4f1-c1856cf7e3b7 req-69bed602-8cf4-4e6f-8eae-8a470e264a15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.223 182759 DEBUG oslo_concurrency.lockutils [req-7a7156f6-3fac-4c36-a4f1-c1856cf7e3b7 req-69bed602-8cf4-4e6f-8eae-8a470e264a15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.223 182759 DEBUG nova.compute.manager [req-7a7156f6-3fac-4c36-a4f1-c1856cf7e3b7 req-69bed602-8cf4-4e6f-8eae-8a470e264a15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.223 182759 WARNING nova.compute.manager [req-7a7156f6-3fac-4c36-a4f1-c1856cf7e3b7 req-69bed602-8cf4-4e6f-8eae-8a470e264a15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.318 182759 INFO nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Took 23.32 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:03:31 np0005591285 nova_compute[182755]: 2026-01-22 00:03:31.319 182759 DEBUG nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:03:34 np0005591285 nova_compute[182755]: 2026-01-22 00:03:34.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:34 np0005591285 nova_compute[182755]: 2026-01-22 00:03:34.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:03:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:34Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d5:43:90 10.100.0.6
Jan 21 19:03:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:34Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:43:90 10.100.0.6
Jan 21 19:03:35 np0005591285 nova_compute[182755]: 2026-01-22 00:03:35.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:35 np0005591285 nova_compute[182755]: 2026-01-22 00:03:35.342 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:36 np0005591285 nova_compute[182755]: 2026-01-22 00:03:36.138 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:37 np0005591285 nova_compute[182755]: 2026-01-22 00:03:37.554 182759 INFO nova.compute.manager [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Took 32.72 seconds to build instance.#033[00m
Jan 21 19:03:38 np0005591285 podman[224772]: 2026-01-22 00:03:38.212350853 +0000 UTC m=+0.065973215 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Jan 21 19:03:38 np0005591285 nova_compute[182755]: 2026-01-22 00:03:38.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:38 np0005591285 podman[224771]: 2026-01-22 00:03:38.227140595 +0000 UTC m=+0.089300884 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible)
Jan 21 19:03:38 np0005591285 nova_compute[182755]: 2026-01-22 00:03:38.510 182759 DEBUG oslo_concurrency.lockutils [None req-afcdf291-457d-476f-a86f-db6577acd4e9 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 34.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:39 np0005591285 nova_compute[182755]: 2026-01-22 00:03:39.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:40 np0005591285 nova_compute[182755]: 2026-01-22 00:03:40.344 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:41 np0005591285 nova_compute[182755]: 2026-01-22 00:03:41.140 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:44 np0005591285 nova_compute[182755]: 2026-01-22 00:03:44.128 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:44 np0005591285 nova_compute[182755]: 2026-01-22 00:03:44.128 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:44 np0005591285 nova_compute[182755]: 2026-01-22 00:03:44.129 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:44 np0005591285 nova_compute[182755]: 2026-01-22 00:03:44.129 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:03:44 np0005591285 podman[224810]: 2026-01-22 00:03:44.186970385 +0000 UTC m=+0.060646202 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.318 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.347 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.387 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.388 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.452 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.581 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.582 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5523MB free_disk=73.23248291015625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.583 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:45 np0005591285 nova_compute[182755]: 2026-01-22 00:03:45.583 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:46 np0005591285 nova_compute[182755]: 2026-01-22 00:03:46.141 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:47 np0005591285 nova_compute[182755]: 2026-01-22 00:03:47.778 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:03:47 np0005591285 nova_compute[182755]: 2026-01-22 00:03:47.779 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:03:47 np0005591285 nova_compute[182755]: 2026-01-22 00:03:47.779 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:03:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:47Z|00318|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 19:03:47 np0005591285 nova_compute[182755]: 2026-01-22 00:03:47.920 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:03:47 np0005591285 nova_compute[182755]: 2026-01-22 00:03:47.936 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:47 np0005591285 NetworkManager[55017]: <info>  [1769040227.9375] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 21 19:03:47 np0005591285 NetworkManager[55017]: <info>  [1769040227.9394] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 21 19:03:48 np0005591285 nova_compute[182755]: 2026-01-22 00:03:48.040 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:48Z|00319|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 19:03:48 np0005591285 nova_compute[182755]: 2026-01-22 00:03:48.055 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:48 np0005591285 podman[224843]: 2026-01-22 00:03:48.226016974 +0000 UTC m=+0.093495985 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:03:48 np0005591285 podman[224842]: 2026-01-22 00:03:48.226829087 +0000 UTC m=+0.099571728 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:03:48 np0005591285 nova_compute[182755]: 2026-01-22 00:03:48.229 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:03:48 np0005591285 podman[224844]: 2026-01-22 00:03:48.282127606 +0000 UTC m=+0.149041882 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:03:50 np0005591285 nova_compute[182755]: 2026-01-22 00:03:50.352 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:51 np0005591285 nova_compute[182755]: 2026-01-22 00:03:51.144 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:51 np0005591285 nova_compute[182755]: 2026-01-22 00:03:51.977 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:03:51 np0005591285 nova_compute[182755]: 2026-01-22 00:03:51.978 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:52 np0005591285 nova_compute[182755]: 2026-01-22 00:03:52.979 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:55 np0005591285 nova_compute[182755]: 2026-01-22 00:03:55.356 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.147 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.262 182759 DEBUG nova.compute.manager [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-changed-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.262 182759 DEBUG nova.compute.manager [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Refreshing instance network info cache due to event network-changed-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.263 182759 DEBUG oslo_concurrency.lockutils [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.263 182759 DEBUG oslo_concurrency.lockutils [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.264 182759 DEBUG nova.network.neutron [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Refreshing network info cache for port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.329 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.329 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.330 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.357 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.414 182759 DEBUG oslo_concurrency.lockutils [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.414 182759 DEBUG oslo_concurrency.lockutils [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.414 182759 DEBUG nova.compute.manager [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.420 182759 DEBUG nova.compute.manager [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.421 182759 DEBUG nova.objects.instance [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.454 182759 DEBUG nova.objects.instance [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'info_cache' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.536 182759 DEBUG nova.virt.libvirt.driver [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 21 19:03:56 np0005591285 nova_compute[182755]: 2026-01-22 00:03:56.986 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:56.988 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:03:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:56.990 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:03:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:56.991 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:58 np0005591285 kernel: tapd13b0c1b-9c (unregistering): left promiscuous mode
Jan 21 19:03:58 np0005591285 NetworkManager[55017]: <info>  [1769040238.7164] device (tapd13b0c1b-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:03:58 np0005591285 nova_compute[182755]: 2026-01-22 00:03:58.725 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:58 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:58Z|00320|binding|INFO|Releasing lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 from this chassis (sb_readonly=0)
Jan 21 19:03:58 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:58Z|00321|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 down in Southbound
Jan 21 19:03:58 np0005591285 ovn_controller[94908]: 2026-01-22T00:03:58Z|00322|binding|INFO|Removing iface tapd13b0c1b-9c ovn-installed in OVS
Jan 21 19:03:58 np0005591285 nova_compute[182755]: 2026-01-22 00:03:58.730 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.746 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:43:90 10.100.0.6'], port_security=['fa:16:3e:d5:43:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:03:58 np0005591285 nova_compute[182755]: 2026-01-22 00:03:58.749 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.750 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.751 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.753 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[33fa763a-3a49-4d3a-a6dc-0670bac79dd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.754 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore#033[00m
Jan 21 19:03:58 np0005591285 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 21 19:03:58 np0005591285 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000058.scope: Consumed 14.963s CPU time.
Jan 21 19:03:58 np0005591285 systemd-machined[154022]: Machine qemu-40-instance-00000058 terminated.
Jan 21 19:03:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [NOTICE]   (224748) : haproxy version is 2.8.14-c23fe91
Jan 21 19:03:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [NOTICE]   (224748) : path to executable is /usr/sbin/haproxy
Jan 21 19:03:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [WARNING]  (224748) : Exiting Master process...
Jan 21 19:03:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [WARNING]  (224748) : Exiting Master process...
Jan 21 19:03:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [ALERT]    (224748) : Current worker (224750) exited with code 143 (Terminated)
Jan 21 19:03:58 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224743]: [WARNING]  (224748) : All workers exited. Exiting... (0)
Jan 21 19:03:58 np0005591285 systemd[1]: libpod-b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8.scope: Deactivated successfully.
Jan 21 19:03:58 np0005591285 podman[224929]: 2026-01-22 00:03:58.877298414 +0000 UTC m=+0.043472096 container died b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:03:58 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8-userdata-shm.mount: Deactivated successfully.
Jan 21 19:03:58 np0005591285 systemd[1]: var-lib-containers-storage-overlay-5951641f78b3a1125f90cc09267cbb365a746d5e99652f56ede65ea07edc0ed1-merged.mount: Deactivated successfully.
Jan 21 19:03:58 np0005591285 podman[224929]: 2026-01-22 00:03:58.907422275 +0000 UTC m=+0.073595947 container cleanup b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:03:58 np0005591285 systemd[1]: libpod-conmon-b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8.scope: Deactivated successfully.
Jan 21 19:03:58 np0005591285 kernel: tapd13b0c1b-9c: entered promiscuous mode
Jan 21 19:03:58 np0005591285 kernel: tapd13b0c1b-9c (unregistering): left promiscuous mode
Jan 21 19:03:58 np0005591285 NetworkManager[55017]: <info>  [1769040238.9663] manager: (tapd13b0c1b-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 21 19:03:58 np0005591285 nova_compute[182755]: 2026-01-22 00:03:58.970 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:58 np0005591285 podman[224961]: 2026-01-22 00:03:58.976947053 +0000 UTC m=+0.046645191 container remove b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.984 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[78151b97-1366-4a8d-a1db-80cc741061fb]: (4, ('Thu Jan 22 12:03:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8)\nb3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8\nThu Jan 22 12:03:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (b3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8)\nb3895f1bed95c746d1d7765ffe7931447737dde099babcb02d572afaca346eb8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.986 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[502bb70f-91de-4507-90c9-b619eedd155f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:58.987 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:03:58 np0005591285 nova_compute[182755]: 2026-01-22 00:03:58.989 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:58 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.007 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:03:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:59.011 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd2efbc-0259-4a5a-9e04-c5abe021363c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:59.030 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5915f45a-83a9-422e-884a-97152c7445e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:59.032 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b49799dc-7500-4c8d-adac-ab4fd529177a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:59.050 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[01dacbde-21d7-4e2e-8126-68cbb8dc61a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475670, 'reachable_time': 36322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224995, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:59.054 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:03:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:03:59.054 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[c44eac60-aac6-4996-88fc-e0a83a0a9fca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:03:59 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.553 182759 INFO nova.virt.libvirt.driver [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance shutdown successfully after 3 seconds.#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.561 182759 INFO nova.virt.libvirt.driver [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance destroyed successfully.#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.561 182759 DEBUG nova.objects.instance [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'numa_topology' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.592 182759 DEBUG nova.compute.manager [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.702 182759 DEBUG oslo_concurrency.lockutils [None req-30b7e0de-0b32-4fba-8919-ea168d103873 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.763 182759 DEBUG nova.network.neutron [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updated VIF entry in instance network info cache for port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.764 182759 DEBUG nova.network.neutron [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.809 182759 DEBUG oslo_concurrency.lockutils [req-e33da7f3-ed3a-43c5-a11d-05dc269be901 req-f7766553-3819-478a-aa61-820b7e9dad8d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.810 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.810 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.811 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.958 182759 DEBUG nova.compute.manager [req-8f7ce849-e617-48ea-a489-011eee6566cf req-d2024666-8fc9-4a8b-86be-1e324c984d33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.959 182759 DEBUG oslo_concurrency.lockutils [req-8f7ce849-e617-48ea-a489-011eee6566cf req-d2024666-8fc9-4a8b-86be-1e324c984d33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.960 182759 DEBUG oslo_concurrency.lockutils [req-8f7ce849-e617-48ea-a489-011eee6566cf req-d2024666-8fc9-4a8b-86be-1e324c984d33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.960 182759 DEBUG oslo_concurrency.lockutils [req-8f7ce849-e617-48ea-a489-011eee6566cf req-d2024666-8fc9-4a8b-86be-1e324c984d33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.961 182759 DEBUG nova.compute.manager [req-8f7ce849-e617-48ea-a489-011eee6566cf req-d2024666-8fc9-4a8b-86be-1e324c984d33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:03:59 np0005591285 nova_compute[182755]: 2026-01-22 00:03:59.962 182759 WARNING nova.compute.manager [req-8f7ce849-e617-48ea-a489-011eee6566cf req-d2024666-8fc9-4a8b-86be-1e324c984d33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state stopped and task_state None.#033[00m
Jan 21 19:04:00 np0005591285 nova_compute[182755]: 2026-01-22 00:04:00.359 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:01 np0005591285 nova_compute[182755]: 2026-01-22 00:04:01.149 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:02 np0005591285 nova_compute[182755]: 2026-01-22 00:04:02.181 182759 DEBUG nova.compute.manager [req-de359cbd-6e10-4bdd-859e-2986923094c7 req-ada60983-43e0-495e-b66e-4ecae3c3ac3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:02 np0005591285 nova_compute[182755]: 2026-01-22 00:04:02.181 182759 DEBUG oslo_concurrency.lockutils [req-de359cbd-6e10-4bdd-859e-2986923094c7 req-ada60983-43e0-495e-b66e-4ecae3c3ac3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:02 np0005591285 nova_compute[182755]: 2026-01-22 00:04:02.182 182759 DEBUG oslo_concurrency.lockutils [req-de359cbd-6e10-4bdd-859e-2986923094c7 req-ada60983-43e0-495e-b66e-4ecae3c3ac3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:02 np0005591285 nova_compute[182755]: 2026-01-22 00:04:02.182 182759 DEBUG oslo_concurrency.lockutils [req-de359cbd-6e10-4bdd-859e-2986923094c7 req-ada60983-43e0-495e-b66e-4ecae3c3ac3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:02 np0005591285 nova_compute[182755]: 2026-01-22 00:04:02.182 182759 DEBUG nova.compute.manager [req-de359cbd-6e10-4bdd-859e-2986923094c7 req-ada60983-43e0-495e-b66e-4ecae3c3ac3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:02 np0005591285 nova_compute[182755]: 2026-01-22 00:04:02.182 182759 WARNING nova.compute.manager [req-de359cbd-6e10-4bdd-859e-2986923094c7 req-ada60983-43e0-495e-b66e-4ecae3c3ac3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state stopped and task_state None.#033[00m
Jan 21 19:04:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:02.970 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:02.970 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:02.970 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.279 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.519 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.520 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.521 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.522 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.522 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.522 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.569 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'info_cache' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.611 182759 DEBUG oslo_concurrency.lockutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.611 182759 DEBUG oslo_concurrency.lockutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:04:03 np0005591285 nova_compute[182755]: 2026-01-22 00:04:03.611 182759 DEBUG nova.network.neutron [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:04:04 np0005591285 nova_compute[182755]: 2026-01-22 00:04:04.120 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:05 np0005591285 nova_compute[182755]: 2026-01-22 00:04:05.361 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:06 np0005591285 nova_compute[182755]: 2026-01-22 00:04:06.151 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.771 182759 DEBUG nova.network.neutron [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.802 182759 DEBUG oslo_concurrency.lockutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.838 182759 INFO nova.virt.libvirt.driver [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance destroyed successfully.#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.839 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'numa_topology' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.867 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.882 182759 DEBUG nova.virt.libvirt.vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:03:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:03:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.883 182759 DEBUG nova.network.os_vif_util [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.884 182759 DEBUG nova.network.os_vif_util [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.884 182759 DEBUG os_vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.885 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.886 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd13b0c1b-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.887 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.890 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.898 182759 INFO os_vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c')#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.904 182759 DEBUG nova.virt.libvirt.driver [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Start _get_guest_xml network_info=[{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.907 182759 WARNING nova.virt.libvirt.driver [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.916 182759 DEBUG nova.virt.libvirt.host [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.917 182759 DEBUG nova.virt.libvirt.host [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.924 182759 DEBUG nova.virt.libvirt.host [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.925 182759 DEBUG nova.virt.libvirt.host [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.926 182759 DEBUG nova.virt.libvirt.driver [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.927 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.927 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.928 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.928 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.928 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.929 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.929 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.930 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.930 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.930 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.930 182759 DEBUG nova.virt.hardware [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.931 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:08 np0005591285 nova_compute[182755]: 2026-01-22 00:04:08.971 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.052 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.054 182759 DEBUG oslo_concurrency.lockutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.054 182759 DEBUG oslo_concurrency.lockutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.055 182759 DEBUG oslo_concurrency.lockutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.057 182759 DEBUG nova.virt.libvirt.vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:03:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:03:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.058 182759 DEBUG nova.network.os_vif_util [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.059 182759 DEBUG nova.network.os_vif_util [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.061 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.087 182759 DEBUG nova.virt.libvirt.driver [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <uuid>65bbb3bd-2b3c-4868-bf10-ce8795c0a312</uuid>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <name>instance-00000058</name>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestJSON-server-1767669163</nova:name>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:04:08</nova:creationTime>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        <nova:port uuid="d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <entry name="serial">65bbb3bd-2b3c-4868-bf10-ce8795c0a312</entry>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <entry name="uuid">65bbb3bd-2b3c-4868-bf10-ce8795c0a312</entry>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk.config"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:d5:43:90"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <target dev="tapd13b0c1b-9c"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/console.log" append="off"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:04:09 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:04:09 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:04:09 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:04:09 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.089 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.175 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.176 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:09 np0005591285 podman[225002]: 2026-01-22 00:04:09.209377345 +0000 UTC m=+0.064885367 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:04:09 np0005591285 podman[225001]: 2026-01-22 00:04:09.212621612 +0000 UTC m=+0.070087466 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.233 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.235 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.255 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.313 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.314 182759 DEBUG nova.virt.disk.api [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.315 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.377 182759 DEBUG oslo_concurrency.processutils [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.378 182759 DEBUG nova.virt.disk.api [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.378 182759 DEBUG nova.objects.instance [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.401 182759 DEBUG nova.virt.libvirt.vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:03:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:03:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.402 182759 DEBUG nova.network.os_vif_util [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.403 182759 DEBUG nova.network.os_vif_util [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.403 182759 DEBUG os_vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.403 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.404 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.404 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.410 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd13b0c1b-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.410 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd13b0c1b-9c, col_values=(('external_ids', {'iface-id': 'd13b0c1b-9c16-4db4-bc03-d7ffef3f3af0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:43:90', 'vm-uuid': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.412 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.4130] manager: (tapd13b0c1b-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.420 182759 INFO os_vif [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c')#033[00m
Jan 21 19:04:09 np0005591285 kernel: tapd13b0c1b-9c: entered promiscuous mode
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.5343] manager: (tapd13b0c1b-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Jan 21 19:04:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:09Z|00323|binding|INFO|Claiming lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for this chassis.
Jan 21 19:04:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:09Z|00324|binding|INFO|d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0: Claiming fa:16:3e:d5:43:90 10.100.0.6
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.537 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.553 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:43:90 10.100.0.6'], port_security=['fa:16:3e:d5:43:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.554 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.556 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7#033[00m
Jan 21 19:04:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:09Z|00325|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 ovn-installed in OVS
Jan 21 19:04:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:09Z|00326|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 up in Southbound
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.558 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.560 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.569 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[79cd5d8a-758b-4371-973e-7853833b2ce9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.570 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:04:09 np0005591285 systemd-udevd[225069]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.573 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.573 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6affc5-b59b-4d1f-9959-d4d65354ecb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.574 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b871796f-aaa1-41ff-9c24-1662ad0e8686]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 systemd-machined[154022]: New machine qemu-41-instance-00000058.
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.5843] device (tapd13b0c1b-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.5855] device (tapd13b0c1b-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.587 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef1c35b-ee32-4c52-b46c-fd2ba7320fe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 systemd[1]: Started Virtual Machine qemu-41-instance-00000058.
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.601 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[603929fc-6783-4014-bb22-388e23a0e9c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.636 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[662c83f6-2da7-4b9d-83e8-a51a9e9a8e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.642 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dfade72b-b768-4f4a-8471-df5de3a9e523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 systemd-udevd[225073]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.6477] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.681 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[41a50a3b-15cc-4f00-a80e-dffae2e89f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.684 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f7ec9e-2cf3-4b06-8cbd-a36df4a1e4ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.7091] device (tap19c3e0c8-50): carrier: link connected
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.716 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cc71b1fc-c56f-4729-967c-c5c2a3b6b427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.740 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[522e47f0-be44-4d50-b134-989737aa8c9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480836, 'reachable_time': 23287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225102, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.758 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6704e7-bdee-405b-935e-d1161250157a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480836, 'tstamp': 480836}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225103, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.775 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0659ed5-b499-4006-86ab-d9bec257a318]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480836, 'reachable_time': 23287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225104, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.808 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c45cdb63-2ebd-425c-ac34-5df700cba71c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.885 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e26a824c-846f-4cbf-bd0e-81cd9783ebd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.887 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.887 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.888 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.890 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.892 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 NetworkManager[55017]: <info>  [1769040249.8935] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.893 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.895 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:09Z|00327|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.896 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.897 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.898 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[88180219-fee3-4d7e-9d9c-9a5ffca2aeb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.899 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:04:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:09.900 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.912 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.996 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.997 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040249.9958866, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:04:09 np0005591285 nova_compute[182755]: 2026-01-22 00:04:09.997 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.000 182759 DEBUG nova.compute.manager [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.008 182759 INFO nova.virt.libvirt.driver [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance rebooted successfully.#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.008 182759 DEBUG nova.compute.manager [None req-848d58c3-3793-46df-8301-22b771e43d1d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.102 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.105 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.170 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.171 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040249.9969406, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.171 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Started (Lifecycle Event)#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.198 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:10 np0005591285 nova_compute[182755]: 2026-01-22 00:04:10.201 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:04:10 np0005591285 podman[225140]: 2026-01-22 00:04:10.271243038 +0000 UTC m=+0.046505472 container create 39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:04:10 np0005591285 systemd[1]: Started libpod-conmon-39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf.scope.
Jan 21 19:04:10 np0005591285 podman[225140]: 2026-01-22 00:04:10.246150033 +0000 UTC m=+0.021412477 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:04:10 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:04:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a4ebd01a84867086768ac0b9f2a85653bf5b1653bb014db7179ae5f3029d303/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:04:10 np0005591285 podman[225140]: 2026-01-22 00:04:10.370806006 +0000 UTC m=+0.146068440 container init 39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:04:10 np0005591285 podman[225140]: 2026-01-22 00:04:10.376033907 +0000 UTC m=+0.151296341 container start 39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:04:10 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [NOTICE]   (225159) : New worker (225161) forked
Jan 21 19:04:10 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [NOTICE]   (225159) : Loading success.
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.132 182759 DEBUG nova.compute.manager [req-7951ae11-4b64-40df-9bfc-3032a0365256 req-964aa00d-a8bb-4019-92f1-6b21db03eda9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.134 182759 DEBUG oslo_concurrency.lockutils [req-7951ae11-4b64-40df-9bfc-3032a0365256 req-964aa00d-a8bb-4019-92f1-6b21db03eda9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.134 182759 DEBUG oslo_concurrency.lockutils [req-7951ae11-4b64-40df-9bfc-3032a0365256 req-964aa00d-a8bb-4019-92f1-6b21db03eda9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.135 182759 DEBUG oslo_concurrency.lockutils [req-7951ae11-4b64-40df-9bfc-3032a0365256 req-964aa00d-a8bb-4019-92f1-6b21db03eda9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.135 182759 DEBUG nova.compute.manager [req-7951ae11-4b64-40df-9bfc-3032a0365256 req-964aa00d-a8bb-4019-92f1-6b21db03eda9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.136 182759 WARNING nova.compute.manager [req-7951ae11-4b64-40df-9bfc-3032a0365256 req-964aa00d-a8bb-4019-92f1-6b21db03eda9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:04:11 np0005591285 nova_compute[182755]: 2026-01-22 00:04:11.153 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:13 np0005591285 nova_compute[182755]: 2026-01-22 00:04:13.278 182759 DEBUG nova.compute.manager [req-f185274f-e6f8-46e9-8d40-21f51cffc0bd req-be8770ae-cf2e-4d4b-a5b1-d3ac8bb6c928 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:13 np0005591285 nova_compute[182755]: 2026-01-22 00:04:13.278 182759 DEBUG oslo_concurrency.lockutils [req-f185274f-e6f8-46e9-8d40-21f51cffc0bd req-be8770ae-cf2e-4d4b-a5b1-d3ac8bb6c928 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:13 np0005591285 nova_compute[182755]: 2026-01-22 00:04:13.279 182759 DEBUG oslo_concurrency.lockutils [req-f185274f-e6f8-46e9-8d40-21f51cffc0bd req-be8770ae-cf2e-4d4b-a5b1-d3ac8bb6c928 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:13 np0005591285 nova_compute[182755]: 2026-01-22 00:04:13.279 182759 DEBUG oslo_concurrency.lockutils [req-f185274f-e6f8-46e9-8d40-21f51cffc0bd req-be8770ae-cf2e-4d4b-a5b1-d3ac8bb6c928 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:13 np0005591285 nova_compute[182755]: 2026-01-22 00:04:13.280 182759 DEBUG nova.compute.manager [req-f185274f-e6f8-46e9-8d40-21f51cffc0bd req-be8770ae-cf2e-4d4b-a5b1-d3ac8bb6c928 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:13 np0005591285 nova_compute[182755]: 2026-01-22 00:04:13.280 182759 WARNING nova.compute.manager [req-f185274f-e6f8-46e9-8d40-21f51cffc0bd req-be8770ae-cf2e-4d4b-a5b1-d3ac8bb6c928 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:04:14 np0005591285 nova_compute[182755]: 2026-01-22 00:04:14.412 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:15 np0005591285 podman[225171]: 2026-01-22 00:04:15.210005807 +0000 UTC m=+0.077583898 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:04:16 np0005591285 nova_compute[182755]: 2026-01-22 00:04:16.156 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:17 np0005591285 nova_compute[182755]: 2026-01-22 00:04:17.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:19 np0005591285 podman[225196]: 2026-01-22 00:04:19.192735889 +0000 UTC m=+0.055817403 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:04:19 np0005591285 podman[225195]: 2026-01-22 00:04:19.2176859 +0000 UTC m=+0.084012931 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.224 182759 DEBUG nova.objects.instance [None req-9652db06-6cde-4d2a-85b8-0f0885c9ecf8 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:19 np0005591285 podman[225197]: 2026-01-22 00:04:19.240999198 +0000 UTC m=+0.093822496 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.292 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040259.2920134, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.292 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.357 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.364 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.414 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:19 np0005591285 kernel: tapd13b0c1b-9c (unregistering): left promiscuous mode
Jan 21 19:04:19 np0005591285 NetworkManager[55017]: <info>  [1769040259.9570] device (tapd13b0c1b-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:04:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:19Z|00328|binding|INFO|Releasing lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 from this chassis (sb_readonly=0)
Jan 21 19:04:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:19Z|00329|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 down in Southbound
Jan 21 19:04:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:19Z|00330|binding|INFO|Removing iface tapd13b0c1b-9c ovn-installed in OVS
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.964 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:19 np0005591285 nova_compute[182755]: 2026-01-22 00:04:19.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:19.982 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:43:90 10.100.0.6'], port_security=['fa:16:3e:d5:43:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:04:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:19.985 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis#033[00m
Jan 21 19:04:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:19.988 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:04:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:19.990 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[15ffd76a-bf1f-49ef-be7f-8745099cdca8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:19.991 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore#033[00m
Jan 21 19:04:20 np0005591285 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 21 19:04:20 np0005591285 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000058.scope: Consumed 10.395s CPU time.
Jan 21 19:04:20 np0005591285 systemd-machined[154022]: Machine qemu-41-instance-00000058 terminated.
Jan 21 19:04:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [NOTICE]   (225159) : haproxy version is 2.8.14-c23fe91
Jan 21 19:04:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [NOTICE]   (225159) : path to executable is /usr/sbin/haproxy
Jan 21 19:04:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [WARNING]  (225159) : Exiting Master process...
Jan 21 19:04:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [ALERT]    (225159) : Current worker (225161) exited with code 143 (Terminated)
Jan 21 19:04:20 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225155]: [WARNING]  (225159) : All workers exited. Exiting... (0)
Jan 21 19:04:20 np0005591285 systemd[1]: libpod-39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf.scope: Deactivated successfully.
Jan 21 19:04:20 np0005591285 podman[225288]: 2026-01-22 00:04:20.124157744 +0000 UTC m=+0.044314184 container died 39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:04:20 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf-userdata-shm.mount: Deactivated successfully.
Jan 21 19:04:20 np0005591285 systemd[1]: var-lib-containers-storage-overlay-9a4ebd01a84867086768ac0b9f2a85653bf5b1653bb014db7179ae5f3029d303-merged.mount: Deactivated successfully.
Jan 21 19:04:20 np0005591285 nova_compute[182755]: 2026-01-22 00:04:20.154 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:20 np0005591285 podman[225288]: 2026-01-22 00:04:20.155961329 +0000 UTC m=+0.076117799 container cleanup 39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:04:20 np0005591285 nova_compute[182755]: 2026-01-22 00:04:20.158 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:20 np0005591285 systemd[1]: libpod-conmon-39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf.scope: Deactivated successfully.
Jan 21 19:04:20 np0005591285 nova_compute[182755]: 2026-01-22 00:04:20.191 182759 DEBUG nova.compute.manager [None req-9652db06-6cde-4d2a-85b8-0f0885c9ecf8 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:20 np0005591285 podman[225324]: 2026-01-22 00:04:20.219633492 +0000 UTC m=+0.041494398 container remove 39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.225 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2699eef5-1719-45ac-a907-05ef037ec823]: (4, ('Thu Jan 22 12:04:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf)\n39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf\nThu Jan 22 12:04:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf)\n39a0c5b0563a94d905b4f2233e2fcbf42912844c5e4087157f50a3928b77facf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.227 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7759ce-86bd-4b01-a586-0c3ff901f950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.227 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:20 np0005591285 nova_compute[182755]: 2026-01-22 00:04:20.229 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:20 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 19:04:20 np0005591285 nova_compute[182755]: 2026-01-22 00:04:20.244 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.246 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc8118a-0ea8-4915-be9d-1d60ffb2078f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.266 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2380b145-4a9f-4fa4-b009-b8c0fe4cf305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.267 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8b29280b-7725-4819-9b0d-cd844e6646e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.285 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cc48ff-3b8d-4dba-86c7-90c8f1ac4ef3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480828, 'reachable_time': 27914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225351, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:20 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.289 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:04:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:20.289 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f56a02-be55-494d-9019-32afd36166b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.157 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.897 182759 DEBUG nova.compute.manager [req-2c3634b8-5215-4919-895d-4bd804c64d8a req-4ce7bf50-1823-4ce2-8519-14302e7e2cbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.897 182759 DEBUG oslo_concurrency.lockutils [req-2c3634b8-5215-4919-895d-4bd804c64d8a req-4ce7bf50-1823-4ce2-8519-14302e7e2cbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.898 182759 DEBUG oslo_concurrency.lockutils [req-2c3634b8-5215-4919-895d-4bd804c64d8a req-4ce7bf50-1823-4ce2-8519-14302e7e2cbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.898 182759 DEBUG oslo_concurrency.lockutils [req-2c3634b8-5215-4919-895d-4bd804c64d8a req-4ce7bf50-1823-4ce2-8519-14302e7e2cbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.898 182759 DEBUG nova.compute.manager [req-2c3634b8-5215-4919-895d-4bd804c64d8a req-4ce7bf50-1823-4ce2-8519-14302e7e2cbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:21 np0005591285 nova_compute[182755]: 2026-01-22 00:04:21.898 182759 WARNING nova.compute.manager [req-2c3634b8-5215-4919-895d-4bd804c64d8a req-4ce7bf50-1823-4ce2-8519-14302e7e2cbf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state suspended and task_state None.#033[00m
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.167 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'name': 'tempest-ServerActionsTestJSON-server-1767669163', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000058', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'cccb624dbe6d4401a89e9cd254f91828', 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'hostId': '98bf05fc3cde3063e357af07cf32397d1b83b1095afc25a5e9b316ae', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.171 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.171 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>]
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.176 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.178 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.179 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.181 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.182 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.183 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.183 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.184 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.185 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.185 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>]
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.186 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.187 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.188 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.189 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.190 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.190 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.192 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.192 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.193 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.195 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.195 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.195 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>]
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.196 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.197 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.197 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1767669163>]
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.197 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.198 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.200 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:04:23.201 12 DEBUG ceilometer.compute.pollsters [-] Instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000058, id=65bbb3bd-2b3c-4868-bf10-ce8795c0a312>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:04:23 np0005591285 nova_compute[182755]: 2026-01-22 00:04:23.929 182759 INFO nova.compute.manager [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Resuming#033[00m
Jan 21 19:04:23 np0005591285 nova_compute[182755]: 2026-01-22 00:04:23.930 182759 DEBUG nova.objects.instance [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'flavor' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.002 182759 DEBUG oslo_concurrency.lockutils [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.002 182759 DEBUG oslo_concurrency.lockutils [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.003 182759 DEBUG nova.network.neutron [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.097 182759 DEBUG nova.compute.manager [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.097 182759 DEBUG oslo_concurrency.lockutils [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.098 182759 DEBUG oslo_concurrency.lockutils [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.098 182759 DEBUG oslo_concurrency.lockutils [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.098 182759 DEBUG nova.compute.manager [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.098 182759 WARNING nova.compute.manager [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.417 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:24 np0005591285 nova_compute[182755]: 2026-01-22 00:04:24.950 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:25 np0005591285 nova_compute[182755]: 2026-01-22 00:04:25.175 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:26 np0005591285 nova_compute[182755]: 2026-01-22 00:04:26.184 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.871 182759 DEBUG nova.network.neutron [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [{"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.909 182759 DEBUG oslo_concurrency.lockutils [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-65bbb3bd-2b3c-4868-bf10-ce8795c0a312" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.915 182759 DEBUG nova.virt.libvirt.vif [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:03:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:04:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.916 182759 DEBUG nova.network.os_vif_util [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.917 182759 DEBUG nova.network.os_vif_util [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.917 182759 DEBUG os_vif [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.917 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.918 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.918 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.925 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.925 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd13b0c1b-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.926 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd13b0c1b-9c, col_values=(('external_ids', {'iface-id': 'd13b0c1b-9c16-4db4-bc03-d7ffef3f3af0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:43:90', 'vm-uuid': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.926 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.926 182759 INFO os_vif [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c')#033[00m
Jan 21 19:04:27 np0005591285 nova_compute[182755]: 2026-01-22 00:04:27.961 182759 DEBUG nova.objects.instance [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'numa_topology' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:28 np0005591285 kernel: tapd13b0c1b-9c: entered promiscuous mode
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.0566] manager: (tapd13b0c1b-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Jan 21 19:04:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:28Z|00331|binding|INFO|Claiming lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for this chassis.
Jan 21 19:04:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:28Z|00332|binding|INFO|d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0: Claiming fa:16:3e:d5:43:90 10.100.0.6
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.058 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.076 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.083 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.0854] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.0876] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.096 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:43:90 10.100.0.6'], port_security=['fa:16:3e:d5:43:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.099 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.101 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7#033[00m
Jan 21 19:04:28 np0005591285 systemd-machined[154022]: New machine qemu-42-instance-00000058.
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.123 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f63bdda1-8b78-437e-86b0-ef525c4cc669]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.124 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.127 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.127 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3168c6b3-567e-496e-a72f-7ec2596b12c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.128 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[af20e378-7cab-4132-a67a-42ac1b67ce3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.146 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb963f6-cf54-40cc-a1df-991c9881a7d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.178 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[66c39d4e-d623-434f-b624-802738769397]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 systemd[1]: Started Virtual Machine qemu-42-instance-00000058.
Jan 21 19:04:28 np0005591285 systemd-udevd[225375]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.224 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[81701221-d9e2-4005-8439-813dc975c3a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.2333] device (tapd13b0c1b-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:04:28 np0005591285 systemd-udevd[225378]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.234 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4699da86-05ae-410f-b8c3-259c497b0d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.2481] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/166)
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.245 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.2490] device (tapd13b0c1b-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.275 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:28Z|00333|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 ovn-installed in OVS
Jan 21 19:04:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:28Z|00334|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 up in Southbound
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.277 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.281 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5cd972-119b-476b-8fe1-a58ed44c2eb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.285 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e4511206-fea6-44a0-ae09-cf50011b1713]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.3138] device (tap19c3e0c8-50): carrier: link connected
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.317 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6add33fc-a171-439c-aa50-2338bc4ae33d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.336 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c088d8f1-2236-4b6b-a1ce-f14e7572dd99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482697, 'reachable_time': 39101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225403, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.353 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2c9a8d-e00b-4ce7-afd2-d7d3a50cf261]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 482697, 'tstamp': 482697}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225404, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.371 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4c017bd4-d017-40d3-8444-5b066a22001d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482697, 'reachable_time': 39101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225405, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.404 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[10ef56cd-5d10-4e4f-977d-6f1defeefd54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.465 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6b85967b-2076-474c-b9cf-bf0ff596b8d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.467 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.468 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.468 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.477 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 NetworkManager[55017]: <info>  [1769040268.4785] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Jan 21 19:04:28 np0005591285 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.479 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.480 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.481 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:28Z|00335|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=1)
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.499 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.501 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.501 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.502 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1f21144a-8205-4700-a6cb-49803ae98515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.503 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:04:28 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:28.505 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.841 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.842 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040268.8408027, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.843 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Started (Lifecycle Event)#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.876 182759 DEBUG nova.compute.manager [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.877 182759 DEBUG nova.objects.instance [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.880 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.886 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:04:28 np0005591285 podman[225444]: 2026-01-22 00:04:28.906640186 +0000 UTC m=+0.067100966 container create 60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.913 182759 INFO nova.virt.libvirt.driver [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance running successfully.#033[00m
Jan 21 19:04:28 np0005591285 virtqemud[182299]: argument unsupported: QEMU guest agent is not configured
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.915 182759 DEBUG nova.virt.libvirt.guest [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.916 182759 DEBUG nova.compute.manager [None req-35f6369e-0636-40d2-8c0f-50ed08e48fdb 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.923 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.924 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040268.8475716, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.924 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:04:28 np0005591285 systemd[1]: Started libpod-conmon-60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073.scope.
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.967 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:28 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:04:28 np0005591285 nova_compute[182755]: 2026-01-22 00:04:28.970 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:04:28 np0005591285 podman[225444]: 2026-01-22 00:04:28.882597799 +0000 UTC m=+0.043058599 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:04:28 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94a178792babc5b38a60cb14c0e02ad88c0305d07a92b9f24d354433ad87fa38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:04:28 np0005591285 podman[225444]: 2026-01-22 00:04:28.998327373 +0000 UTC m=+0.158788153 container init 60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:04:29 np0005591285 podman[225444]: 2026-01-22 00:04:29.005058003 +0000 UTC m=+0.165518783 container start 60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.006 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 21 19:04:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [NOTICE]   (225463) : New worker (225465) forked
Jan 21 19:04:29 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [NOTICE]   (225463) : Loading success.
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.169 182759 DEBUG nova.compute.manager [req-b44880d3-bc46-4bc6-99bd-c2db5a67be13 req-455bcc83-bc0e-4769-bb23-4b900683e1a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.169 182759 DEBUG oslo_concurrency.lockutils [req-b44880d3-bc46-4bc6-99bd-c2db5a67be13 req-455bcc83-bc0e-4769-bb23-4b900683e1a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.169 182759 DEBUG oslo_concurrency.lockutils [req-b44880d3-bc46-4bc6-99bd-c2db5a67be13 req-455bcc83-bc0e-4769-bb23-4b900683e1a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.170 182759 DEBUG oslo_concurrency.lockutils [req-b44880d3-bc46-4bc6-99bd-c2db5a67be13 req-455bcc83-bc0e-4769-bb23-4b900683e1a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.170 182759 DEBUG nova.compute.manager [req-b44880d3-bc46-4bc6-99bd-c2db5a67be13 req-455bcc83-bc0e-4769-bb23-4b900683e1a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.170 182759 WARNING nova.compute.manager [req-b44880d3-bc46-4bc6-99bd-c2db5a67be13 req-455bcc83-bc0e-4769-bb23-4b900683e1a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:04:29 np0005591285 nova_compute[182755]: 2026-01-22 00:04:29.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.189 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.380 182759 DEBUG nova.compute.manager [req-66e3ab69-d79e-4a86-8d48-94f211f1660d req-4116541e-f7e9-43a9-83e5-0e14fb316842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.381 182759 DEBUG oslo_concurrency.lockutils [req-66e3ab69-d79e-4a86-8d48-94f211f1660d req-4116541e-f7e9-43a9-83e5-0e14fb316842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.381 182759 DEBUG oslo_concurrency.lockutils [req-66e3ab69-d79e-4a86-8d48-94f211f1660d req-4116541e-f7e9-43a9-83e5-0e14fb316842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.382 182759 DEBUG oslo_concurrency.lockutils [req-66e3ab69-d79e-4a86-8d48-94f211f1660d req-4116541e-f7e9-43a9-83e5-0e14fb316842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.382 182759 DEBUG nova.compute.manager [req-66e3ab69-d79e-4a86-8d48-94f211f1660d req-4116541e-f7e9-43a9-83e5-0e14fb316842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:31 np0005591285 nova_compute[182755]: 2026-01-22 00:04:31.382 182759 WARNING nova.compute.manager [req-66e3ab69-d79e-4a86-8d48-94f211f1660d req-4116541e-f7e9-43a9-83e5-0e14fb316842 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:04:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:32Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:43:90 10.100.0.6
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.119 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.119 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.120 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.120 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.120 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.131 182759 INFO nova.compute.manager [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Terminating instance#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.143 182759 DEBUG nova.compute.manager [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:04:33 np0005591285 kernel: tapd13b0c1b-9c (unregistering): left promiscuous mode
Jan 21 19:04:33 np0005591285 NetworkManager[55017]: <info>  [1769040273.1644] device (tapd13b0c1b-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:04:33 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:33Z|00336|binding|INFO|Releasing lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 from this chassis (sb_readonly=0)
Jan 21 19:04:33 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:33Z|00337|binding|INFO|Setting lport d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 down in Southbound
Jan 21 19:04:33 np0005591285 ovn_controller[94908]: 2026-01-22T00:04:33Z|00338|binding|INFO|Removing iface tapd13b0c1b-9c ovn-installed in OVS
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.174 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.182 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:43:90 10.100.0.6'], port_security=['fa:16:3e:d5:43:90 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '65bbb3bd-2b3c-4868-bf10-ce8795c0a312', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.183 104259 INFO neutron.agent.ovn.metadata.agent [-] Port d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.184 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.185 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[21863bf1-3606-40fb-bc23-781df99c79fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.186 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.193 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:33 np0005591285 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000058.scope: Deactivated successfully.
Jan 21 19:04:33 np0005591285 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000058.scope: Consumed 4.313s CPU time.
Jan 21 19:04:33 np0005591285 systemd-machined[154022]: Machine qemu-42-instance-00000058 terminated.
Jan 21 19:04:33 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [NOTICE]   (225463) : haproxy version is 2.8.14-c23fe91
Jan 21 19:04:33 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [NOTICE]   (225463) : path to executable is /usr/sbin/haproxy
Jan 21 19:04:33 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [WARNING]  (225463) : Exiting Master process...
Jan 21 19:04:33 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [WARNING]  (225463) : Exiting Master process...
Jan 21 19:04:33 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [ALERT]    (225463) : Current worker (225465) exited with code 143 (Terminated)
Jan 21 19:04:33 np0005591285 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[225459]: [WARNING]  (225463) : All workers exited. Exiting... (0)
Jan 21 19:04:33 np0005591285 systemd[1]: libpod-60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073.scope: Deactivated successfully.
Jan 21 19:04:33 np0005591285 podman[225512]: 2026-01-22 00:04:33.325829048 +0000 UTC m=+0.049491641 container died 60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:04:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay-94a178792babc5b38a60cb14c0e02ad88c0305d07a92b9f24d354433ad87fa38-merged.mount: Deactivated successfully.
Jan 21 19:04:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073-userdata-shm.mount: Deactivated successfully.
Jan 21 19:04:33 np0005591285 podman[225512]: 2026-01-22 00:04:33.352857086 +0000 UTC m=+0.076519679 container cleanup 60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:33 np0005591285 podman[225546]: 2026-01-22 00:04:33.416386205 +0000 UTC m=+0.039114143 container remove 60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:04:33 np0005591285 systemd[1]: libpod-conmon-60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073.scope: Deactivated successfully.
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.418 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[88d40bc4-26c4-4391-81dc-adaf57926c9c]: (4, ('Thu Jan 22 12:04:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073)\n60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073\nThu Jan 22 12:04:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073)\n60a47496ede3ac89677b7ba49948de50d5a31e67a2fd7b300d8d445b905aa073\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.420 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b071d124-62ae-440e-af3f-2fb2a85d9255]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.421 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:33 np0005591285 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.441 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.444 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d3fb61-88a5-44b1-b9f0-251b4ff9b947]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.459 182759 INFO nova.virt.libvirt.driver [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Instance destroyed successfully.#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.460 182759 DEBUG nova.objects.instance [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.463 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[13b050fd-7d70-41a6-9583-f148b490be1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.464 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8e14eaee-3537-4580-a40d-ea646cb98bd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.480 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e681bc1b-4ad1-4bd8-9c26-b09590e916b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 482687, 'reachable_time': 23094, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225577, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.482 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:04:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:33.482 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd149ed-82d9-4af7-bc2b-eecd77e3974d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:04:33 np0005591285 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.497 182759 DEBUG nova.virt.libvirt.vif [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1767669163',display_name='tempest-ServerActionsTestJSON-server-1767669163',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1767669163',id=88,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:03:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-200ojavi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:04:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=65bbb3bd-2b3c-4868-bf10-ce8795c0a312,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.497 182759 DEBUG nova.network.os_vif_util [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "address": "fa:16:3e:d5:43:90", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd13b0c1b-9c", "ovs_interfaceid": "d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.498 182759 DEBUG nova.network.os_vif_util [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.498 182759 DEBUG os_vif [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.500 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd13b0c1b-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.503 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.507 182759 INFO os_vif [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:43:90,bridge_name='br-int',has_traffic_filtering=True,id=d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd13b0c1b-9c')#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.507 182759 INFO nova.virt.libvirt.driver [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Deleting instance files /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312_del#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.508 182759 INFO nova.virt.libvirt.driver [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Deletion of /var/lib/nova/instances/65bbb3bd-2b3c-4868-bf10-ce8795c0a312_del complete#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.842 182759 INFO nova.compute.manager [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Took 0.70 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.842 182759 DEBUG oslo.service.loopingcall [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.843 182759 DEBUG nova.compute.manager [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.843 182759 DEBUG nova.network.neutron [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.918 182759 DEBUG nova.compute.manager [req-c7224481-aea9-49ee-b6bc-7928fedec674 req-e1b15f71-77a3-4928-8f1b-6e122398e91c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.918 182759 DEBUG oslo_concurrency.lockutils [req-c7224481-aea9-49ee-b6bc-7928fedec674 req-e1b15f71-77a3-4928-8f1b-6e122398e91c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.918 182759 DEBUG oslo_concurrency.lockutils [req-c7224481-aea9-49ee-b6bc-7928fedec674 req-e1b15f71-77a3-4928-8f1b-6e122398e91c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.919 182759 DEBUG oslo_concurrency.lockutils [req-c7224481-aea9-49ee-b6bc-7928fedec674 req-e1b15f71-77a3-4928-8f1b-6e122398e91c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.919 182759 DEBUG nova.compute.manager [req-c7224481-aea9-49ee-b6bc-7928fedec674 req-e1b15f71-77a3-4928-8f1b-6e122398e91c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:33 np0005591285 nova_compute[182755]: 2026-01-22 00:04:33.919 182759 DEBUG nova.compute.manager [req-c7224481-aea9-49ee-b6bc-7928fedec674 req-e1b15f71-77a3-4928-8f1b-6e122398e91c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-unplugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:04:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:34.377 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:04:34 np0005591285 nova_compute[182755]: 2026-01-22 00:04:34.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:34.378 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:04:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:04:34.379 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.190 182759 DEBUG nova.compute.manager [req-3a4d2a0e-7865-4e15-b0f1-12e68bc9af76 req-facbce36-1004-464b-8d48-4e84b97d335b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.190 182759 DEBUG oslo_concurrency.lockutils [req-3a4d2a0e-7865-4e15-b0f1-12e68bc9af76 req-facbce36-1004-464b-8d48-4e84b97d335b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.191 182759 DEBUG oslo_concurrency.lockutils [req-3a4d2a0e-7865-4e15-b0f1-12e68bc9af76 req-facbce36-1004-464b-8d48-4e84b97d335b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.191 182759 DEBUG oslo_concurrency.lockutils [req-3a4d2a0e-7865-4e15-b0f1-12e68bc9af76 req-facbce36-1004-464b-8d48-4e84b97d335b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.191 182759 DEBUG nova.compute.manager [req-3a4d2a0e-7865-4e15-b0f1-12e68bc9af76 req-facbce36-1004-464b-8d48-4e84b97d335b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] No waiting events found dispatching network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.191 182759 WARNING nova.compute.manager [req-3a4d2a0e-7865-4e15-b0f1-12e68bc9af76 req-facbce36-1004-464b-8d48-4e84b97d335b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received unexpected event network-vif-plugged-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.191 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:36 np0005591285 nova_compute[182755]: 2026-01-22 00:04:36.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.161 182759 DEBUG nova.network.neutron [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.215 182759 INFO nova.compute.manager [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Took 3.37 seconds to deallocate network for instance.#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.351 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.352 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.410 182759 DEBUG nova.scheduler.client.report [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.417 182759 DEBUG nova.compute.manager [req-4f1f9049-1b5b-4470-a5c4-2edc920a8b3b req-51721337-7b78-4dd4-9f18-e730771d04b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Received event network-vif-deleted-d13b0c1b-9c16-4db4-bc03-d7ffef3f3af0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.451 182759 DEBUG nova.scheduler.client.report [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.451 182759 DEBUG nova.compute.provider_tree [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:04:37 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 19:04:37 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.496 182759 DEBUG nova.scheduler.client.report [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.550 182759 DEBUG nova.scheduler.client.report [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.628 182759 DEBUG nova.compute.provider_tree [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.657 182759 DEBUG nova.scheduler.client.report [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.690 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.771 182759 INFO nova.scheduler.client.report [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Deleted allocations for instance 65bbb3bd-2b3c-4868-bf10-ce8795c0a312#033[00m
Jan 21 19:04:37 np0005591285 nova_compute[182755]: 2026-01-22 00:04:37.905 182759 DEBUG oslo_concurrency.lockutils [None req-8b33456a-0b77-419a-8c54-1f6524a2bae6 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "65bbb3bd-2b3c-4868-bf10-ce8795c0a312" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:38 np0005591285 nova_compute[182755]: 2026-01-22 00:04:38.502 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:40 np0005591285 podman[225579]: 2026-01-22 00:04:40.206165964 +0000 UTC m=+0.069593162 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:04:40 np0005591285 nova_compute[182755]: 2026-01-22 00:04:40.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:40 np0005591285 nova_compute[182755]: 2026-01-22 00:04:40.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:40 np0005591285 podman[225580]: 2026-01-22 00:04:40.23575466 +0000 UTC m=+0.083558428 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.190 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.244 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.244 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.244 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.265 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.266 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.266 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.266 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.422 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.423 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5603MB free_disk=73.26117324829102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.423 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.423 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.518 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.519 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.553 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.573 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.606 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:04:41 np0005591285 nova_compute[182755]: 2026-01-22 00:04:41.606 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:42 np0005591285 nova_compute[182755]: 2026-01-22 00:04:42.422 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:42 np0005591285 nova_compute[182755]: 2026-01-22 00:04:42.580 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:04:43 np0005591285 nova_compute[182755]: 2026-01-22 00:04:43.504 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:46 np0005591285 podman[225619]: 2026-01-22 00:04:46.190114948 +0000 UTC m=+0.053966793 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:04:46 np0005591285 nova_compute[182755]: 2026-01-22 00:04:46.192 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:46 np0005591285 nova_compute[182755]: 2026-01-22 00:04:46.233 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:46 np0005591285 nova_compute[182755]: 2026-01-22 00:04:46.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:46 np0005591285 nova_compute[182755]: 2026-01-22 00:04:46.520 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:48 np0005591285 nova_compute[182755]: 2026-01-22 00:04:48.457 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040273.456792, 65bbb3bd-2b3c-4868-bf10-ce8795c0a312 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:04:48 np0005591285 nova_compute[182755]: 2026-01-22 00:04:48.458 182759 INFO nova.compute.manager [-] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:04:48 np0005591285 nova_compute[182755]: 2026-01-22 00:04:48.493 182759 DEBUG nova.compute.manager [None req-7d43a27e-afa7-482e-8f78-9588eea4917c - - - - - -] [instance: 65bbb3bd-2b3c-4868-bf10-ce8795c0a312] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:04:48 np0005591285 nova_compute[182755]: 2026-01-22 00:04:48.505 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:50 np0005591285 podman[225646]: 2026-01-22 00:04:50.186066725 +0000 UTC m=+0.054450746 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:04:50 np0005591285 podman[225645]: 2026-01-22 00:04:50.200858393 +0000 UTC m=+0.065535754 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:04:50 np0005591285 podman[225647]: 2026-01-22 00:04:50.266607872 +0000 UTC m=+0.133368939 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:04:51 np0005591285 nova_compute[182755]: 2026-01-22 00:04:51.194 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:53 np0005591285 nova_compute[182755]: 2026-01-22 00:04:53.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.210 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.211 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.233 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.542 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.543 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.550 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.551 182759 INFO nova.compute.claims [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.718 182759 DEBUG nova.compute.provider_tree [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.734 182759 DEBUG nova.scheduler.client.report [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.756 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.757 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.816 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.816 182759 DEBUG nova.network.neutron [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.836 182759 INFO nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.856 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.975 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.976 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.976 182759 INFO nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Creating image(s)#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.977 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "/var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.977 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.978 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:54 np0005591285 nova_compute[182755]: 2026-01-22 00:04:54.996 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.050 182759 DEBUG nova.policy [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.072 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.073 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.074 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.085 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.159 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.161 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.203 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.204 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.204 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.297 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.298 182759 DEBUG nova.virt.disk.api [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Checking if we can resize image /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.298 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.367 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.368 182759 DEBUG nova.virt.disk.api [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Cannot resize image /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.368 182759 DEBUG nova.objects.instance [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'migration_context' on Instance uuid e5fe3f24-b0cd-4353-af64-6c1c92f1581d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.507 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.508 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Ensure instance console log exists: /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.509 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.510 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:04:55 np0005591285 nova_compute[182755]: 2026-01-22 00:04:55.510 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:04:56 np0005591285 nova_compute[182755]: 2026-01-22 00:04:56.197 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:57 np0005591285 nova_compute[182755]: 2026-01-22 00:04:57.291 182759 DEBUG nova.network.neutron [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Successfully created port: 4281bc8f-b881-4082-9fc7-f4b6436a837d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.633 182759 DEBUG nova.network.neutron [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Successfully updated port: 4281bc8f-b881-4082-9fc7-f4b6436a837d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.654 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.655 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquired lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.656 182759 DEBUG nova.network.neutron [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.779 182759 DEBUG nova.compute.manager [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-changed-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.780 182759 DEBUG nova.compute.manager [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Refreshing instance network info cache due to event network-changed-4281bc8f-b881-4082-9fc7-f4b6436a837d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.781 182759 DEBUG oslo_concurrency.lockutils [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:04:58 np0005591285 nova_compute[182755]: 2026-01-22 00:04:58.895 182759 DEBUG nova.network.neutron [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.199 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.282 182759 DEBUG nova.network.neutron [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updating instance_info_cache with network_info: [{"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.333 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Releasing lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.334 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Instance network_info: |[{"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.335 182759 DEBUG oslo_concurrency.lockutils [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.336 182759 DEBUG nova.network.neutron [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Refreshing network info cache for port 4281bc8f-b881-4082-9fc7-f4b6436a837d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.340 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Start _get_guest_xml network_info=[{"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.348 182759 WARNING nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.354 182759 DEBUG nova.virt.libvirt.host [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.356 182759 DEBUG nova.virt.libvirt.host [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.360 182759 DEBUG nova.virt.libvirt.host [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.361 182759 DEBUG nova.virt.libvirt.host [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.365 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.365 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.366 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.367 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.367 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.368 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.368 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.368 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.369 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.369 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.370 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.370 182759 DEBUG nova.virt.hardware [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.376 182759 DEBUG nova.virt.libvirt.vif [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:04:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1403378080',display_name='tempest-ServerActionsTestOtherA-server-1403378080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1403378080',id=91,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH70xah7ihgEIoUx5I8Vi9VE8DEeMG53SOL9NCSbgEeBRV9je/jiE2sCWFNA3ItoX/qylG9OqBTijx5WPdqmM5JgcD0QcWbaXaoP4id2xYDAqen7JSpxK96w9/70dxAV2w==',key_name='tempest-keypair-70761650',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-iz1p6ahz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:04:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b4385295f46b45d8803b0c536a989822',uuid=e5fe3f24-b0cd-4353-af64-6c1c92f1581d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.377 182759 DEBUG nova.network.os_vif_util [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.378 182759 DEBUG nova.network.os_vif_util [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.380 182759 DEBUG nova.objects.instance [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5fe3f24-b0cd-4353-af64-6c1c92f1581d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.396 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <uuid>e5fe3f24-b0cd-4353-af64-6c1c92f1581d</uuid>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <name>instance-0000005b</name>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestOtherA-server-1403378080</nova:name>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:05:01</nova:creationTime>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:user uuid="b4385295f46b45d8803b0c536a989822">tempest-ServerActionsTestOtherA-1347085859-project-member</nova:user>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:project uuid="c299d482d37e45169cca3d6f178e8555">tempest-ServerActionsTestOtherA-1347085859</nova:project>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        <nova:port uuid="4281bc8f-b881-4082-9fc7-f4b6436a837d">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <entry name="serial">e5fe3f24-b0cd-4353-af64-6c1c92f1581d</entry>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <entry name="uuid">e5fe3f24-b0cd-4353-af64-6c1c92f1581d</entry>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.config"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:e2:23:c6"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <target dev="tap4281bc8f-b8"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/console.log" append="off"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:05:01 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:05:01 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:05:01 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:05:01 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.399 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Preparing to wait for external event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.399 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.400 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.400 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.402 182759 DEBUG nova.virt.libvirt.vif [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:04:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1403378080',display_name='tempest-ServerActionsTestOtherA-server-1403378080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1403378080',id=91,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH70xah7ihgEIoUx5I8Vi9VE8DEeMG53SOL9NCSbgEeBRV9je/jiE2sCWFNA3ItoX/qylG9OqBTijx5WPdqmM5JgcD0QcWbaXaoP4id2xYDAqen7JSpxK96w9/70dxAV2w==',key_name='tempest-keypair-70761650',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-iz1p6ahz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:04:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b4385295f46b45d8803b0c536a989822',uuid=e5fe3f24-b0cd-4353-af64-6c1c92f1581d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.402 182759 DEBUG nova.network.os_vif_util [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.403 182759 DEBUG nova.network.os_vif_util [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.404 182759 DEBUG os_vif [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.405 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.406 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.406 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.411 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.412 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4281bc8f-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.413 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4281bc8f-b8, col_values=(('external_ids', {'iface-id': '4281bc8f-b881-4082-9fc7-f4b6436a837d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:23:c6', 'vm-uuid': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:01 np0005591285 NetworkManager[55017]: <info>  [1769040301.4181] manager: (tap4281bc8f-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.423 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.425 182759 INFO os_vif [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8')
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.572 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.572 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.576 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No VIF found with MAC fa:16:3e:e2:23:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 19:05:01 np0005591285 nova_compute[182755]: 2026-01-22 00:05:01.577 182759 INFO nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Using config drive
Jan 21 19:05:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:02.970 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:05:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:02.970 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:05:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:02.971 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.250 182759 INFO nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Creating config drive at /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.config
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.261 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9e3j8wqo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.391 182759 DEBUG oslo_concurrency.processutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9e3j8wqo" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:05:05 np0005591285 kernel: tap4281bc8f-b8: entered promiscuous mode
Jan 21 19:05:05 np0005591285 NetworkManager[55017]: <info>  [1769040305.4646] manager: (tap4281bc8f-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Jan 21 19:05:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:05Z|00339|binding|INFO|Claiming lport 4281bc8f-b881-4082-9fc7-f4b6436a837d for this chassis.
Jan 21 19:05:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:05Z|00340|binding|INFO|4281bc8f-b881-4082-9fc7-f4b6436a837d: Claiming fa:16:3e:e2:23:c6 10.100.0.4
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.465 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.470 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:05 np0005591285 systemd-udevd[225740]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:05:05 np0005591285 NetworkManager[55017]: <info>  [1769040305.5269] device (tap4281bc8f-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:05:05 np0005591285 NetworkManager[55017]: <info>  [1769040305.5279] device (tap4281bc8f-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:05:05 np0005591285 systemd-machined[154022]: New machine qemu-43-instance-0000005b.
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.558 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:05Z|00341|binding|INFO|Setting lport 4281bc8f-b881-4082-9fc7-f4b6436a837d ovn-installed in OVS
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.562 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:05 np0005591285 systemd[1]: Started Virtual Machine qemu-43-instance-0000005b.
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.992 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040305.9916627, e5fe3f24-b0cd-4353-af64-6c1c92f1581d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:05:05 np0005591285 nova_compute[182755]: 2026-01-22 00:05:05.993 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] VM Started (Lifecycle Event)
Jan 21 19:05:06 np0005591285 nova_compute[182755]: 2026-01-22 00:05:06.201 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:06 np0005591285 nova_compute[182755]: 2026-01-22 00:05:06.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:05:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:07Z|00342|binding|INFO|Setting lport 4281bc8f-b881-4082-9fc7-f4b6436a837d up in Southbound
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.439 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:23:c6 10.100.0.4'], port_security=['fa:16:3e:e2:23:c6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8dd89f69-046c-4ee5-879d-2e2669cadea8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=4281bc8f-b881-4082-9fc7-f4b6436a837d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.440 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 4281bc8f-b881-4082-9fc7-f4b6436a837d in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 bound to our chassis
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.442 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.457 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[708269e7-2859-4047-9720-4951d310a7b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.458 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3dacae7-b1 in ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.464 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3dacae7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.464 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e4da3dd9-ac0f-48e2-b8e9-7411c4d47979]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.465 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eacd505f-02c5-4201-8f05-100b8261618a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.474 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.480 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040305.992999, e5fe3f24-b0cd-4353-af64-6c1c92f1581d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.481 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] VM Paused (Lifecycle Event)
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.482 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[4af51e51-3594-4408-b2c5-08eda1a3b18a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.512 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[33f154c0-4bd4-4fc0-9441-08796f619637]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.546 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b995353d-7605-4b71-8497-8ea6177e0b01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 systemd-udevd[225745]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.552 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dfec7308-d095-44f8-8b85-bf638d20c64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 NetworkManager[55017]: <info>  [1769040307.5532] manager: (tapb3dacae7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/170)
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.567 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.574 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.595 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac9d788-07f1-4a8d-aacc-1620504f95cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.597 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bd661218-e360-4b37-943a-941bccaa078a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 NetworkManager[55017]: <info>  [1769040307.6222] device (tapb3dacae7-b0): carrier: link connected
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.626 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[7f103a20-0512-4dd7-a113-81d8c3d6005e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.651 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[91aa72ea-a52c-4129-9d45-622fcf3df97d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486628, 'reachable_time': 44255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225783, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.669 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bc2d97-6ffd-4061-86dd-52cb8c8146df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:f1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486628, 'tstamp': 486628}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225784, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.684 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[96baf621-9715-435b-8db9-977dac3f11c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486628, 'reachable_time': 44255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225785, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.703 182759 DEBUG nova.network.neutron [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updated VIF entry in instance network info cache for port 4281bc8f-b881-4082-9fc7-f4b6436a837d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.703 182759 DEBUG nova.network.neutron [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updating instance_info_cache with network_info: [{"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.715 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b4497891-3170-45b2-96cc-4627fc6c5e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.754 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.754 182759 DEBUG oslo_concurrency.lockutils [req-994c696a-2cf5-44a9-bdac-5b31cda8b4f5 req-bcd983d0-90ee-49db-b707-d6d7f1cc063e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.775 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6f1d6d-4b9e-4178-8868-ae534a31eec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.776 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.777 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.777 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3dacae7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.778 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:07 np0005591285 NetworkManager[55017]: <info>  [1769040307.7799] manager: (tapb3dacae7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 21 19:05:07 np0005591285 kernel: tapb3dacae7-b0: entered promiscuous mode
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.782 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3dacae7-b0, col_values=(('external_ids', {'iface-id': '90cfb65b-4764-45c8-aca6-274b0a687241'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:05:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:07Z|00343|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.783 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:07 np0005591285 nova_compute[182755]: 2026-01-22 00:05:07.793 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.793 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.794 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[173d6fe3-e4f4-406d-916c-0d4ec5bbe628]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.795 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:05:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:07.796 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'env', 'PROCESS_TAG=haproxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:05:08 np0005591285 podman[225818]: 2026-01-22 00:05:08.254291867 +0000 UTC m=+0.065484104 container create e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:05:08 np0005591285 systemd[1]: Started libpod-conmon-e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee.scope.
Jan 21 19:05:08 np0005591285 podman[225818]: 2026-01-22 00:05:08.225655745 +0000 UTC m=+0.036847992 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:05:08 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:05:08 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94235a5c0ac13e34ac15efe1d18357abd8dc1a5561b97d37f35bf7b6ee58cbea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:05:08 np0005591285 podman[225818]: 2026-01-22 00:05:08.340117085 +0000 UTC m=+0.151309332 container init e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:05:08 np0005591285 podman[225818]: 2026-01-22 00:05:08.344985486 +0000 UTC m=+0.156177723 container start e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:05:08 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [NOTICE]   (225840) : New worker (225842) forked
Jan 21 19:05:08 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [NOTICE]   (225840) : Loading success.
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.488 182759 DEBUG nova.compute.manager [req-ed7357c6-bfd2-4d92-b213-45fd200b6385 req-73c2fabe-3dfc-47da-88a7-5346d5bb4be1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.489 182759 DEBUG oslo_concurrency.lockutils [req-ed7357c6-bfd2-4d92-b213-45fd200b6385 req-73c2fabe-3dfc-47da-88a7-5346d5bb4be1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.489 182759 DEBUG oslo_concurrency.lockutils [req-ed7357c6-bfd2-4d92-b213-45fd200b6385 req-73c2fabe-3dfc-47da-88a7-5346d5bb4be1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.490 182759 DEBUG oslo_concurrency.lockutils [req-ed7357c6-bfd2-4d92-b213-45fd200b6385 req-73c2fabe-3dfc-47da-88a7-5346d5bb4be1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.490 182759 DEBUG nova.compute.manager [req-ed7357c6-bfd2-4d92-b213-45fd200b6385 req-73c2fabe-3dfc-47da-88a7-5346d5bb4be1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Processing event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.491 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.494 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040310.494583, e5fe3f24-b0cd-4353-af64-6c1c92f1581d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.495 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.497 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.500 182759 INFO nova.virt.libvirt.driver [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Instance spawned successfully.#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.500 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.518 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.525 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.529 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.529 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.530 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.530 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.531 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.531 182759 DEBUG nova.virt.libvirt.driver [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.557 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.604 182759 INFO nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Took 15.63 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.605 182759 DEBUG nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.750 182759 INFO nova.compute.manager [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Took 16.35 seconds to build instance.#033[00m
Jan 21 19:05:10 np0005591285 nova_compute[182755]: 2026-01-22 00:05:10.802 182759 DEBUG oslo_concurrency.lockutils [None req-42d88e4e-d95b-4b50-a14c-53de9c59324d b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:05:11 np0005591285 podman[225852]: 2026-01-22 00:05:11.191865044 +0000 UTC m=+0.063524539 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:05:11 np0005591285 nova_compute[182755]: 2026-01-22 00:05:11.202 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:11 np0005591285 podman[225851]: 2026-01-22 00:05:11.23927728 +0000 UTC m=+0.113610357 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 19:05:11 np0005591285 nova_compute[182755]: 2026-01-22 00:05:11.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.612 182759 DEBUG nova.compute.manager [req-4068d42d-608c-43d4-94d1-c316842bd135 req-4bb0e493-622f-4c7c-9d6a-ab52692b2489 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.613 182759 DEBUG oslo_concurrency.lockutils [req-4068d42d-608c-43d4-94d1-c316842bd135 req-4bb0e493-622f-4c7c-9d6a-ab52692b2489 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.614 182759 DEBUG oslo_concurrency.lockutils [req-4068d42d-608c-43d4-94d1-c316842bd135 req-4bb0e493-622f-4c7c-9d6a-ab52692b2489 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.614 182759 DEBUG oslo_concurrency.lockutils [req-4068d42d-608c-43d4-94d1-c316842bd135 req-4bb0e493-622f-4c7c-9d6a-ab52692b2489 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.615 182759 DEBUG nova.compute.manager [req-4068d42d-608c-43d4-94d1-c316842bd135 req-4bb0e493-622f-4c7c-9d6a-ab52692b2489 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] No waiting events found dispatching network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.616 182759 WARNING nova.compute.manager [req-4068d42d-608c-43d4-94d1-c316842bd135 req-4bb0e493-622f-4c7c-9d6a-ab52692b2489 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received unexpected event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d for instance with vm_state active and task_state None.#033[00m
Jan 21 19:05:12 np0005591285 NetworkManager[55017]: <info>  [1769040312.9601] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Jan 21 19:05:12 np0005591285 NetworkManager[55017]: <info>  [1769040312.9614] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 21 19:05:12 np0005591285 nova_compute[182755]: 2026-01-22 00:05:12.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:13 np0005591285 nova_compute[182755]: 2026-01-22 00:05:13.111 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:13 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:13Z|00344|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:05:13 np0005591285 nova_compute[182755]: 2026-01-22 00:05:13.124 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:14 np0005591285 nova_compute[182755]: 2026-01-22 00:05:14.065 182759 DEBUG nova.compute.manager [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-changed-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:05:14 np0005591285 nova_compute[182755]: 2026-01-22 00:05:14.066 182759 DEBUG nova.compute.manager [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Refreshing instance network info cache due to event network-changed-4281bc8f-b881-4082-9fc7-f4b6436a837d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:05:14 np0005591285 nova_compute[182755]: 2026-01-22 00:05:14.066 182759 DEBUG oslo_concurrency.lockutils [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:05:14 np0005591285 nova_compute[182755]: 2026-01-22 00:05:14.066 182759 DEBUG oslo_concurrency.lockutils [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:05:14 np0005591285 nova_compute[182755]: 2026-01-22 00:05:14.067 182759 DEBUG nova.network.neutron [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Refreshing network info cache for port 4281bc8f-b881-4082-9fc7-f4b6436a837d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:05:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:15Z|00345|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:05:15 np0005591285 nova_compute[182755]: 2026-01-22 00:05:15.219 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:16 np0005591285 nova_compute[182755]: 2026-01-22 00:05:16.204 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:16 np0005591285 nova_compute[182755]: 2026-01-22 00:05:16.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:17 np0005591285 podman[225890]: 2026-01-22 00:05:17.17959212 +0000 UTC m=+0.051600359 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:05:17 np0005591285 nova_compute[182755]: 2026-01-22 00:05:17.242 182759 DEBUG nova.network.neutron [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updated VIF entry in instance network info cache for port 4281bc8f-b881-4082-9fc7-f4b6436a837d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:05:17 np0005591285 nova_compute[182755]: 2026-01-22 00:05:17.243 182759 DEBUG nova.network.neutron [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updating instance_info_cache with network_info: [{"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:05:17 np0005591285 nova_compute[182755]: 2026-01-22 00:05:17.272 182759 DEBUG oslo_concurrency.lockutils [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:05:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:18Z|00346|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:05:18 np0005591285 nova_compute[182755]: 2026-01-22 00:05:18.788 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:21 np0005591285 podman[225914]: 2026-01-22 00:05:21.186247674 +0000 UTC m=+0.059615004 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 19:05:21 np0005591285 podman[225915]: 2026-01-22 00:05:21.189190483 +0000 UTC m=+0.058686809 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:05:21 np0005591285 nova_compute[182755]: 2026-01-22 00:05:21.206 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:21 np0005591285 podman[225916]: 2026-01-22 00:05:21.242905438 +0000 UTC m=+0.100907855 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:05:21 np0005591285 nova_compute[182755]: 2026-01-22 00:05:21.422 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:23 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:23Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:23:c6 10.100.0.4
Jan 21 19:05:23 np0005591285 ovn_controller[94908]: 2026-01-22T00:05:23Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:23:c6 10.100.0.4
Jan 21 19:05:25 np0005591285 nova_compute[182755]: 2026-01-22 00:05:25.141 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:26 np0005591285 nova_compute[182755]: 2026-01-22 00:05:26.209 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:26 np0005591285 nova_compute[182755]: 2026-01-22 00:05:26.423 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:28 np0005591285 nova_compute[182755]: 2026-01-22 00:05:28.079 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:29.220 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:05:29 np0005591285 nova_compute[182755]: 2026-01-22 00:05:29.221 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:29.223 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:05:30 np0005591285 nova_compute[182755]: 2026-01-22 00:05:30.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:31 np0005591285 nova_compute[182755]: 2026-01-22 00:05:31.212 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:05:31.227 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:05:31 np0005591285 nova_compute[182755]: 2026-01-22 00:05:31.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:36 np0005591285 nova_compute[182755]: 2026-01-22 00:05:36.213 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:36 np0005591285 nova_compute[182755]: 2026-01-22 00:05:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:36 np0005591285 nova_compute[182755]: 2026-01-22 00:05:36.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:05:36 np0005591285 nova_compute[182755]: 2026-01-22 00:05:36.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:37 np0005591285 nova_compute[182755]: 2026-01-22 00:05:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:37 np0005591285 nova_compute[182755]: 2026-01-22 00:05:37.396 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:40 np0005591285 nova_compute[182755]: 2026-01-22 00:05:40.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:41 np0005591285 nova_compute[182755]: 2026-01-22 00:05:41.216 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:41 np0005591285 nova_compute[182755]: 2026-01-22 00:05:41.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:41 np0005591285 nova_compute[182755]: 2026-01-22 00:05:41.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:41 np0005591285 nova_compute[182755]: 2026-01-22 00:05:41.427 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:42 np0005591285 podman[225997]: 2026-01-22 00:05:42.216733887 +0000 UTC m=+0.083184559 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:42 np0005591285 podman[225998]: 2026-01-22 00:05:42.233862018 +0000 UTC m=+0.104367699 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.247 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.341 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.448 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.449 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.539 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.701 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.703 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5542MB free_disk=73.23245239257812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.703 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.704 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.898 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance e5fe3f24-b0cd-4353-af64-6c1c92f1581d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.899 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:05:42 np0005591285 nova_compute[182755]: 2026-01-22 00:05:42.899 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:05:43 np0005591285 nova_compute[182755]: 2026-01-22 00:05:43.004 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:05:43 np0005591285 nova_compute[182755]: 2026-01-22 00:05:43.029 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:05:43 np0005591285 nova_compute[182755]: 2026-01-22 00:05:43.069 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:05:43 np0005591285 nova_compute[182755]: 2026-01-22 00:05:43.069 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.070 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.071 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.071 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.370 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.371 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.371 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:05:44 np0005591285 nova_compute[182755]: 2026-01-22 00:05:44.371 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e5fe3f24-b0cd-4353-af64-6c1c92f1581d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:05:46 np0005591285 nova_compute[182755]: 2026-01-22 00:05:46.218 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:46 np0005591285 nova_compute[182755]: 2026-01-22 00:05:46.428 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:48 np0005591285 podman[226045]: 2026-01-22 00:05:48.224717918 +0000 UTC m=+0.088187053 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:05:48 np0005591285 nova_compute[182755]: 2026-01-22 00:05:48.759 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updating instance_info_cache with network_info: [{"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:05:50 np0005591285 nova_compute[182755]: 2026-01-22 00:05:50.180 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-e5fe3f24-b0cd-4353-af64-6c1c92f1581d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:05:50 np0005591285 nova_compute[182755]: 2026-01-22 00:05:50.180 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:05:50 np0005591285 nova_compute[182755]: 2026-01-22 00:05:50.322 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:05:50 np0005591285 nova_compute[182755]: 2026-01-22 00:05:50.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:51 np0005591285 nova_compute[182755]: 2026-01-22 00:05:51.220 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:51 np0005591285 nova_compute[182755]: 2026-01-22 00:05:51.430 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:52 np0005591285 podman[226071]: 2026-01-22 00:05:52.185559141 +0000 UTC m=+0.054154599 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:05:52 np0005591285 podman[226070]: 2026-01-22 00:05:52.214758096 +0000 UTC m=+0.085517481 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:05:52 np0005591285 podman[226072]: 2026-01-22 00:05:52.217065848 +0000 UTC m=+0.080984110 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:05:56 np0005591285 nova_compute[182755]: 2026-01-22 00:05:56.267 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:05:56 np0005591285 nova_compute[182755]: 2026-01-22 00:05:56.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:01 np0005591285 nova_compute[182755]: 2026-01-22 00:06:01.312 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:01 np0005591285 nova_compute[182755]: 2026-01-22 00:06:01.434 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:01 np0005591285 nova_compute[182755]: 2026-01-22 00:06:01.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:02.971 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:02.972 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:02.973 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:05Z|00347|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:06:05 np0005591285 nova_compute[182755]: 2026-01-22 00:06:05.931 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:06 np0005591285 nova_compute[182755]: 2026-01-22 00:06:06.364 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:06 np0005591285 nova_compute[182755]: 2026-01-22 00:06:06.436 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:09.424 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:06:09 np0005591285 nova_compute[182755]: 2026-01-22 00:06:09.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:09.425 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:06:11 np0005591285 nova_compute[182755]: 2026-01-22 00:06:11.401 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:11 np0005591285 nova_compute[182755]: 2026-01-22 00:06:11.438 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:12.427 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:13 np0005591285 podman[226148]: 2026-01-22 00:06:13.209043682 +0000 UTC m=+0.066555141 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:06:13 np0005591285 podman[226147]: 2026-01-22 00:06:13.236976043 +0000 UTC m=+0.089834307 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350)
Jan 21 19:06:14 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:14Z|00348|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:06:14 np0005591285 nova_compute[182755]: 2026-01-22 00:06:14.436 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:16 np0005591285 nova_compute[182755]: 2026-01-22 00:06:16.403 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:16 np0005591285 nova_compute[182755]: 2026-01-22 00:06:16.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:19 np0005591285 podman[226185]: 2026-01-22 00:06:19.189623465 +0000 UTC m=+0.063260492 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:06:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:19Z|00349|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 21 19:06:19 np0005591285 nova_compute[182755]: 2026-01-22 00:06:19.600 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:21 np0005591285 nova_compute[182755]: 2026-01-22 00:06:21.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:21 np0005591285 nova_compute[182755]: 2026-01-22 00:06:21.442 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.164 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005b', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c299d482d37e45169cca3d6f178e8555', 'user_id': 'b4385295f46b45d8803b0c536a989822', 'hostId': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:06:23 np0005591285 podman[226210]: 2026-01-22 00:06:23.188667357 +0000 UTC m=+0.050051047 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:06:23 np0005591285 podman[226209]: 2026-01-22 00:06:23.190187537 +0000 UTC m=+0.055056731 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.194 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.read.requests volume: 1070 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.195 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5f2c10b-fda0-4d30-984c-f7f6adbd6898', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1070, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.165642', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fd760f6-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '7e22931d44d89c6c73adda437f886044fa44f1a6b317522e0859ac56942b5a75'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.165642', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fd77488-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '8757b605cabc18083723453634292ad4a2e0eb5bb16d123d96282b133fa0cc23'}]}, 'timestamp': '2026-01-22 00:06:23.195474', '_unique_id': 'e43f4e8920fb49359594b847c90ff32e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.198 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.199 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.write.requests volume: 332 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.199 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c361ddb-4ef8-4262-887e-a01a5c725fed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 332, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.199396', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fd81c1c-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': 'd15fb329785c61eb9b45403cbae835db2b4adf586c0bdd04924029583855138f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.199396', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fd82572-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': 'aaf233cdf40482effbec53abcbb411ca142eb482c3034b1ad1f9af576f87a13b'}]}, 'timestamp': '2026-01-22 00:06:23.199903', '_unique_id': 'c8497b49f4864c2bb617e7f868927b6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.200 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.201 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.read.latency volume: 201548346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.201 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.read.latency volume: 24399608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87bef205-13d9-41d6-b780-7dc74ca73f72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 201548346, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.201333', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fd8674e-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '89437476abf29369afce8b1a349d718583d02ccbcd8c74de95ec4a5d4d5f41b1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24399608, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.201333', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fd86f0a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '5440e14ac69a6260af28157deb4c30dbb7588bd0d81960ad6aee3ac0984b78a9'}]}, 'timestamp': '2026-01-22 00:06:23.201761', '_unique_id': '524129be86c8468dab741b03ebfb5949'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.206 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e5fe3f24-b0cd-4353-af64-6c1c92f1581d / tap4281bc8f-b8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.206 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bbbc4d8-c577-4dc1-8621-99fa9c943be6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.202922', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fd942ea-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': 'd15e5fb1996e3870a51191adb80cc21550d3eadeff8eb29d23b3f5fc56d5a411'}]}, 'timestamp': '2026-01-22 00:06:23.207244', '_unique_id': '70362c7acad148c3bf2da28e7bfb2045'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.207 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d8947db-e957-424e-9c42-97b6846d97e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.209046', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fd99664-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': 'a4d68b2a3f8d5f18caba0b8d09d805ec4d62e11fe7692ba723a1b2ec5d38a573'}]}, 'timestamp': '2026-01-22 00:06:23.209344', '_unique_id': '63ad8693d3a549a59f93225f7e0bd625'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.209 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.210 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.211 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>]
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.211 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.234 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/memory.usage volume: 42.51953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e589217-f008-49cf-ab9f-e6bf020c4dd3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.51953125, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'timestamp': '2026-01-22T00:06:23.211610', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2fdd9386-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.953940547, 'message_signature': '7b6c9bbccb92131f09db3022648399e2c77b270f4e455fbb16d73c574d099fec'}]}, 'timestamp': '2026-01-22 00:06:23.235605', '_unique_id': 'cfb03ea57b684638bbf57d1459179cec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.236 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.238 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.write.bytes volume: 73089024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.238 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a482a6fc-3132-434f-bdab-97bf748b2893', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73089024, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.238078', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fde0992-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': 'b7a35a17f391b69e355775554faec4f91b9ce10e4206cf3f6398fff18bebd7ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.238078', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fde1676-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': 'fa92eb7ec1a395c71530f6bd01912be6732a4829df92d955b59739d8c41357e4'}]}, 'timestamp': '2026-01-22 00:06:23.238842', '_unique_id': 'a8f4c6cdc30c4501a58e0bdddeb49941'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.239 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.241 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab6ffc50-ed78-4a46-bd3b-5b7fa77bb42a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.241755', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fde9b96-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': '0abf4ce7f12cae4c6d37098ab540c99b5eb07956ee634720ac46303bf5bdd699'}]}, 'timestamp': '2026-01-22 00:06:23.242348', '_unique_id': '6a22af62a7744851835b25ef98fcc5ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.244 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 podman[226211]: 2026-01-22 00:06:23.245242529 +0000 UTC m=+0.105061108 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75456787-78ef-4969-a4a6-c456a015250a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.244843', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fdf0fb8-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': '4a97e23b9a8149e495ec44315e4d8b48d585abdcbe9a6d613ebbab41d3b36877'}]}, 'timestamp': '2026-01-22 00:06:23.245270', '_unique_id': '02514c2d48e542f0854be2ea5c97f82e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.247 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '742fa122-fa07-46d5-8e9c-5859d03131cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.247393', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fdf71ce-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': '5923c753e3d19a140f292c6cc0d6201b5951a16cbf406222954186b1a4f0d624'}]}, 'timestamp': '2026-01-22 00:06:23.247771', '_unique_id': '874036c4b51941e3888253a206b0783b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.248 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.249 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.249 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>]
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.250 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.write.latency volume: 1977698480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.250 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1271dd28-2b16-43ea-9b76-b1b53c4aff00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1977698480, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.250307', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fdfe384-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '1bdf925d7fe4778ad383851a5b2aafd6ea4c5e64fac35151a8d501a098e9b270'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.250307', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fdff0cc-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '1e96a29f0f48491fdaac56c8098d2672732ceabe481b2cb848fcec1c2e864baa'}]}, 'timestamp': '2026-01-22 00:06:23.251047', '_unique_id': 'cc6fb9989873484989cea1218efe5dd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.253 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.read.bytes volume: 29587968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.253 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ef3c298-04d2-4cc8-bee1-1b7210f80420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29587968, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.253031', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fe04ea0-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': '72df2826db70fd600eb90ed82794c838cdde97f969d89d8824b4987191936b3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.253031', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fe05b84-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.884870339, 'message_signature': 'eb641f1560be5d22800da180c30cb355f3757752f6c08826e55af8a1fd719416'}]}, 'timestamp': '2026-01-22 00:06:23.253739', '_unique_id': '55b5e9bd6ed54f93841265a009afc33f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.267 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.267 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5093363b-a7a2-4da7-ac03-a08053dc9de7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.255733', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fe285ee-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.974955762, 'message_signature': '4eb0b50c57d77cc45bee88d28e26e54d128f5c19d9ab840580c7de3127a1b455'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 
'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.255733', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fe294b2-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.974955762, 'message_signature': '529aa1624224f22591c187fc060b78570d35cf56a4718e2f32488d28f3318676'}]}, 'timestamp': '2026-01-22 00:06:23.268291', '_unique_id': '3893b89673d249f38a7b392162fa463b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.270 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.270 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.270 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e8e7c9e-db2f-4d28-a448-3ab1d4a0ab0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.270427', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fe2f5d8-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.974955762, 'message_signature': 'fd3e2f0100e78979f2542b571c59b40db75d40bc5c6abd5431bfddb8476e2ab6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 
'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.270427', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fe303f2-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.974955762, 'message_signature': 'e7d30565c3321a329ae085b234e768017bcb8ddce4a379516375bc59e40c3ab9'}]}, 'timestamp': '2026-01-22 00:06:23.271167', '_unique_id': '2eeb1038c7f04768979e7af9918b4433'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.273 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd99eb068-26db-44e7-9266-3025e0b6dc58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.273192', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fe361e4-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': 'da966ac739d0199dddbeb43c823c4f9e1ff6712cf034e52da29201a439242f96'}]}, 'timestamp': '2026-01-22 00:06:23.273578', '_unique_id': '474c11ce19214d75845e1e657be4a1d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.275 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.275 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>]
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.276 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.276 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/cpu volume: 12040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e6e7f81-8b34-483a-82e2-79e57d8f4390', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12040000000, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'timestamp': '2026-01-22T00:06:23.276192', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2fe3d750-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.953940547, 'message_signature': '91b90ec9ab9b4de96c00eb9e281c90409d9f7b44fcb27c869bc5b0930d88f605'}]}, 'timestamp': '2026-01-22 00:06:23.276570', '_unique_id': '9ea0bf9d5e3f4d2ab5d05f5dfe33b318'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.278 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '809f32d3-e801-4662-87d1-65c72b1b563b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.278487', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fe43056-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': 'ee5db483bca1dc4b4cc52905927db4df46e2d00089ee770af948c24b8dcfa802'}]}, 'timestamp': '2026-01-22 00:06:23.278860', '_unique_id': 'b55c7e7f785741068c2bb4196b2547f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.279 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.280 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.281 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-1403378080>]
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.281 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.281 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec09d811-b525-4a70-8c1d-6c8749458454', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-vda', 'timestamp': '2026-01-22T00:06:23.281426', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2fe4a306-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.974955762, 'message_signature': '90f5a87d07d0b29a3783ffaf2a798ddf96bd42a1f2503245e5387973d5ea3aad'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d-sda', 'timestamp': '2026-01-22T00:06:23.281426', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'instance-0000005b', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2fe4b0da-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.974955762, 'message_signature': '75bcf127dc028c490f33e17e162a61e4e1d7a68ba6845af8db4a26267403e497'}]}, 'timestamp': '2026-01-22 00:06:23.282157', '_unique_id': '097e31876ae44c17a8530aa8898820a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.284 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c02ced5-7105-41ef-9241-2dd81d7a0d15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.284155', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fe50e9a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': '19884bec4802d4b13835d588aeb27748ea116bd3da4d72e76b28254f9c181e21'}]}, 'timestamp': '2026-01-22 00:06:23.284553', '_unique_id': '8bfa756e3a124123a9928d406be525ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.286 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f19ed1-ee05-403e-a04d-2fc4ce07cf9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.286516', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fe56a7a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': 'a360c10e8a0cbc95d6f1e6592341eb191da0e016b75e16e96981969cc00bce9e'}]}, 'timestamp': '2026-01-22 00:06:23.286934', '_unique_id': 'c9f4baaf73a146a1b27757513a3ca42d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.289 12 DEBUG ceilometer.compute.pollsters [-] e5fe3f24-b0cd-4353-af64-6c1c92f1581d/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e5bdb3c-8401-4ff5-8f5b-a3f01ec1952e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_name': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_name': None, 'resource_id': 'instance-0000005b-e5fe3f24-b0cd-4353-af64-6c1c92f1581d-tap4281bc8f-b8', 'timestamp': '2026-01-22T00:06:23.288973', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-1403378080', 'name': 'tap4281bc8f-b8', 'instance_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'instance_type': 'm1.nano', 'host': '0ded5639d0be18dd17c7af5906c0415a5ebee818ba74f55e3f8b05ab', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:23:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4281bc8f-b8'}, 'message_id': '2fe5ca56-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 4941.922124571, 'message_signature': '63d6191c3493931e7aa008177b5f21afdc695da7ef73d06624262206ea96949a'}]}, 'timestamp': '2026-01-22 00:06:23.289363', '_unique_id': '46336b961ca24936984152b4eb819fe2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:06:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:06:26 np0005591285 nova_compute[182755]: 2026-01-22 00:06:26.192 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:26 np0005591285 nova_compute[182755]: 2026-01-22 00:06:26.408 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:26 np0005591285 nova_compute[182755]: 2026-01-22 00:06:26.444 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:27 np0005591285 nova_compute[182755]: 2026-01-22 00:06:27.926 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:31 np0005591285 nova_compute[182755]: 2026-01-22 00:06:31.242 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:31 np0005591285 nova_compute[182755]: 2026-01-22 00:06:31.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:31 np0005591285 nova_compute[182755]: 2026-01-22 00:06:31.445 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:32 np0005591285 nova_compute[182755]: 2026-01-22 00:06:32.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:33 np0005591285 nova_compute[182755]: 2026-01-22 00:06:33.349 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:36 np0005591285 nova_compute[182755]: 2026-01-22 00:06:36.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:36 np0005591285 nova_compute[182755]: 2026-01-22 00:06:36.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:06:36 np0005591285 nova_compute[182755]: 2026-01-22 00:06:36.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:36 np0005591285 nova_compute[182755]: 2026-01-22 00:06:36.447 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:38 np0005591285 nova_compute[182755]: 2026-01-22 00:06:38.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.174 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.175 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.198 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.346 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.347 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.354 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.354 182759 INFO nova.compute.claims [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.549 182759 DEBUG nova.compute.provider_tree [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.570 182759 DEBUG nova.scheduler.client.report [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.597 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.598 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.669 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.669 182759 DEBUG nova.network.neutron [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.706 182759 INFO nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.729 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.883 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.884 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.885 182759 INFO nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Creating image(s)#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.885 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.886 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.886 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.903 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:06:40 np0005591285 nova_compute[182755]: 2026-01-22 00:06:40.995 182759 DEBUG nova.policy [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.010 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.012 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.013 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.041 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.120 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.122 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.163 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.164 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.165 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.239 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.241 182759 DEBUG nova.virt.disk.api [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.242 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.294 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.295 182759 DEBUG nova.virt.disk.api [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.296 182759 DEBUG nova.objects.instance [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 6167cc82-55cf-479c-a543-101634481524 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.317 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.317 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Ensure instance console log exists: /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.318 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.318 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.318 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.449 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:06:41 np0005591285 nova_compute[182755]: 2026-01-22 00:06:41.882 182759 DEBUG nova.network.neutron [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Successfully created port: c8f69aa7-693e-445d-9997-3c34ee42d0ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:06:42 np0005591285 nova_compute[182755]: 2026-01-22 00:06:42.980 182759 DEBUG nova.network.neutron [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Successfully updated port: c8f69aa7-693e-445d-9997-3c34ee42d0ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.001 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.001 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.002 182759 DEBUG nova.network.neutron [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.081 182759 DEBUG nova.compute.manager [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-changed-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.082 182759 DEBUG nova.compute.manager [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Refreshing instance network info cache due to event network-changed-c8f69aa7-693e-445d-9997-3c34ee42d0ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.082 182759 DEBUG oslo_concurrency.lockutils [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.097 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.098 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.099 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.099 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.099 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.113 182759 INFO nova.compute.manager [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Terminating instance#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.122 182759 DEBUG nova.compute.manager [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.159 182759 DEBUG nova.network.neutron [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:06:43 np0005591285 kernel: tap4281bc8f-b8 (unregistering): left promiscuous mode
Jan 21 19:06:43 np0005591285 NetworkManager[55017]: <info>  [1769040403.1738] device (tap4281bc8f-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:06:43 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:43Z|00350|binding|INFO|Releasing lport 4281bc8f-b881-4082-9fc7-f4b6436a837d from this chassis (sb_readonly=0)
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.181 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:43 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:43Z|00351|binding|INFO|Setting lport 4281bc8f-b881-4082-9fc7-f4b6436a837d down in Southbound
Jan 21 19:06:43 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:43Z|00352|binding|INFO|Removing iface tap4281bc8f-b8 ovn-installed in OVS
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.186 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.195 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:23:c6 10.100.0.4'], port_security=['fa:16:3e:e2:23:c6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e5fe3f24-b0cd-4353-af64-6c1c92f1581d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8dd89f69-046c-4ee5-879d-2e2669cadea8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.244'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=4281bc8f-b881-4082-9fc7-f4b6436a837d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.197 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 4281bc8f-b881-4082-9fc7-f4b6436a837d in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 unbound from our chassis#033[00m
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.199 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.204 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.203 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5a44169c-35de-44bf-9bad-98baee81560a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.205 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace which is not needed anymore#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:06:43 np0005591285 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Jan 21 19:06:43 np0005591285 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d0000005b.scope: Consumed 16.551s CPU time.
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.266 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:43 np0005591285 systemd-machined[154022]: Machine qemu-43-instance-0000005b terminated.
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.267 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.267 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.268 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:06:43 np0005591285 podman[226306]: 2026-01-22 00:06:43.337915006 +0000 UTC m=+0.055658888 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 21 19:06:43 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [NOTICE]   (225840) : haproxy version is 2.8.14-c23fe91
Jan 21 19:06:43 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [NOTICE]   (225840) : path to executable is /usr/sbin/haproxy
Jan 21 19:06:43 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [WARNING]  (225840) : Exiting Master process...
Jan 21 19:06:43 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [WARNING]  (225840) : Exiting Master process...
Jan 21 19:06:43 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [ALERT]    (225840) : Current worker (225842) exited with code 143 (Terminated)
Jan 21 19:06:43 np0005591285 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[225833]: [WARNING]  (225840) : All workers exited. Exiting... (0)
Jan 21 19:06:43 np0005591285 systemd[1]: libpod-e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee.scope: Deactivated successfully.
Jan 21 19:06:43 np0005591285 podman[226295]: 2026-01-22 00:06:43.35701523 +0000 UTC m=+0.110844353 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:06:43 np0005591285 podman[226332]: 2026-01-22 00:06:43.360501604 +0000 UTC m=+0.045367221 container died e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:06:43 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee-userdata-shm.mount: Deactivated successfully.
Jan 21 19:06:43 np0005591285 systemd[1]: var-lib-containers-storage-overlay-94235a5c0ac13e34ac15efe1d18357abd8dc1a5561b97d37f35bf7b6ee58cbea-merged.mount: Deactivated successfully.
Jan 21 19:06:43 np0005591285 podman[226332]: 2026-01-22 00:06:43.397782528 +0000 UTC m=+0.082648125 container cleanup e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.397 182759 INFO nova.virt.libvirt.driver [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Instance destroyed successfully.#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.398 182759 DEBUG nova.objects.instance [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'resources' on Instance uuid e5fe3f24-b0cd-4353-af64-6c1c92f1581d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:06:43 np0005591285 systemd[1]: libpod-conmon-e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee.scope: Deactivated successfully.
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.459 182759 DEBUG nova.virt.libvirt.vif [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:04:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1403378080',display_name='tempest-ServerActionsTestOtherA-server-1403378080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1403378080',id=91,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH70xah7ihgEIoUx5I8Vi9VE8DEeMG53SOL9NCSbgEeBRV9je/jiE2sCWFNA3ItoX/qylG9OqBTijx5WPdqmM5JgcD0QcWbaXaoP4id2xYDAqen7JSpxK96w9/70dxAV2w==',key_name='tempest-keypair-70761650',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-iz1p6ahz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:05:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b4385295f46b45d8803b0c536a989822',uuid=e5fe3f24-b0cd-4353-af64-6c1c92f1581d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.460 182759 DEBUG nova.network.os_vif_util [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "address": "fa:16:3e:e2:23:c6", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4281bc8f-b8", "ovs_interfaceid": "4281bc8f-b881-4082-9fc7-f4b6436a837d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.461 182759 DEBUG nova.network.os_vif_util [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.461 182759 DEBUG os_vif [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.464 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.464 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4281bc8f-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:43 np0005591285 podman[226393]: 2026-01-22 00:06:43.465560081 +0000 UTC m=+0.042516305 container remove e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.467 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.468 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.473 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6d9aac-9da1-4e43-9c4a-7fa57b9ade18]: (4, ('Thu Jan 22 12:06:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee)\ne0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee\nThu Jan 22 12:06:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (e0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee)\ne0fea9e92bbcd5d92cb91dfa74d3fbe2dfe84bc51c8e18d97b28cc4566e8e0ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.474 182759 INFO os_vif [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:23:c6,bridge_name='br-int',has_traffic_filtering=True,id=4281bc8f-b881-4082-9fc7-f4b6436a837d,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4281bc8f-b8')
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.475 182759 INFO nova.virt.libvirt.driver [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Deleting instance files /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d_del
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.475 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[96d76d97-90f1-4cb1-be8b-cca927db9cd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.476 182759 INFO nova.virt.libvirt.driver [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Deletion of /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d_del complete
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.476 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:06:43 np0005591285 kernel: tapb3dacae7-b0: left promiscuous mode
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.479 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.482 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[20821fba-4d46-4129-b525-8569742e15b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.491 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.501 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9c26ed-28e6-4ed2-8da8-45ad60bbc4c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.502 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5d37b9-94c2-47e9-9b6a-a987fdb87d08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.512 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000005b, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/e5fe3f24-b0cd-4353-af64-6c1c92f1581d/disk
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.518 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0aaf7466-4ef8-4bd0-978d-01fea656f699]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486619, 'reachable_time': 21223, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226408, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.523 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 19:06:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:43.524 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[241a590d-1f8b-499f-8f8f-1801b38b3fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:06:43 np0005591285 systemd[1]: run-netns-ovnmeta\x2db3dacae7\x2db9cd\x2d426c\x2daa4a\x2d3a6b971c7ee5.mount: Deactivated successfully.
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.637 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.638 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5690MB free_disk=73.23227310180664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.638 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.638 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.694 182759 INFO nova.compute.manager [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Took 0.57 seconds to destroy the instance on the hypervisor.
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.694 182759 DEBUG oslo.service.loopingcall [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.695 182759 DEBUG nova.compute.manager [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.695 182759 DEBUG nova.network.neutron [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.814 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance e5fe3f24-b0cd-4353-af64-6c1c92f1581d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.814 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 6167cc82-55cf-479c-a543-101634481524 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.815 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.815 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.901 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.918 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.956 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 19:06:43 np0005591285 nova_compute[182755]: 2026-01-22 00:06:43.957 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.658 182759 DEBUG nova.network.neutron [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updating instance_info_cache with network_info: [{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.726 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.727 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Instance network_info: |[{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.727 182759 DEBUG oslo_concurrency.lockutils [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.728 182759 DEBUG nova.network.neutron [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Refreshing network info cache for port c8f69aa7-693e-445d-9997-3c34ee42d0ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.733 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Start _get_guest_xml network_info=[{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.739 182759 WARNING nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.747 182759 DEBUG nova.virt.libvirt.host [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.748 182759 DEBUG nova.virt.libvirt.host [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.756 182759 DEBUG nova.virt.libvirt.host [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.757 182759 DEBUG nova.virt.libvirt.host [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.759 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.759 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.760 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.761 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.761 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.761 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.762 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.762 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.763 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.763 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.764 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.764 182759 DEBUG nova.virt.hardware [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.770 182759 DEBUG nova.virt.libvirt.vif [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1751465895',display_name='tempest-TestNetworkBasicOps-server-1751465895',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1751465895',id=98,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUQEFaz6xFbfUUIyFDhdsrUS3W/OFy4rX5dg+VsnwsKvybUejlkijhF0sCEbEK0YXF3bzeH12g1xvPUzBL7J4wI/PY6jPsJBtb13ZmCpPxLQe7XZOC3++3YTNNntYsYIQ==',key_name='tempest-TestNetworkBasicOps-194618748',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-y000o108',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:40Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6167cc82-55cf-479c-a543-101634481524,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.771 182759 DEBUG nova.network.os_vif_util [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.772 182759 DEBUG nova.network.os_vif_util [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.774 182759 DEBUG nova.objects.instance [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 6167cc82-55cf-479c-a543-101634481524 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.790 182759 DEBUG nova.compute.manager [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-vif-unplugged-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.791 182759 DEBUG oslo_concurrency.lockutils [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.794 182759 DEBUG oslo_concurrency.lockutils [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.795 182759 DEBUG oslo_concurrency.lockutils [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.795 182759 DEBUG nova.compute.manager [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] No waiting events found dispatching network-vif-unplugged-4281bc8f-b881-4082-9fc7-f4b6436a837d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.796 182759 DEBUG nova.compute.manager [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-vif-unplugged-4281bc8f-b881-4082-9fc7-f4b6436a837d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.796 182759 DEBUG nova.compute.manager [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.797 182759 DEBUG oslo_concurrency.lockutils [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.797 182759 DEBUG oslo_concurrency.lockutils [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.798 182759 DEBUG oslo_concurrency.lockutils [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.798 182759 DEBUG nova.compute.manager [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] No waiting events found dispatching network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.799 182759 WARNING nova.compute.manager [req-6679c8e7-4459-4a1f-958b-2d474c2674d5 req-bb90c0a5-2a21-4d29-8273-63be15811ad4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received unexpected event network-vif-plugged-4281bc8f-b881-4082-9fc7-f4b6436a837d for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.803 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <uuid>6167cc82-55cf-479c-a543-101634481524</uuid>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <name>instance-00000062</name>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-1751465895</nova:name>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:06:44</nova:creationTime>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        <nova:port uuid="c8f69aa7-693e-445d-9997-3c34ee42d0ad">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <entry name="serial">6167cc82-55cf-479c-a543-101634481524</entry>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <entry name="uuid">6167cc82-55cf-479c-a543-101634481524</entry>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.config"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:6a:66:fa"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <target dev="tapc8f69aa7-69"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/console.log" append="off"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:06:44 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:06:44 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:06:44 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:06:44 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.804 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Preparing to wait for external event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.804 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.805 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.805 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.805 182759 DEBUG nova.virt.libvirt.vif [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1751465895',display_name='tempest-TestNetworkBasicOps-server-1751465895',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1751465895',id=98,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUQEFaz6xFbfUUIyFDhdsrUS3W/OFy4rX5dg+VsnwsKvybUejlkijhF0sCEbEK0YXF3bzeH12g1xvPUzBL7J4wI/PY6jPsJBtb13ZmCpPxLQe7XZOC3++3YTNNntYsYIQ==',key_name='tempest-TestNetworkBasicOps-194618748',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-y000o108',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:40Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6167cc82-55cf-479c-a543-101634481524,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.806 182759 DEBUG nova.network.os_vif_util [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.806 182759 DEBUG nova.network.os_vif_util [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.807 182759 DEBUG os_vif [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.807 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.808 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.811 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8f69aa7-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.812 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8f69aa7-69, col_values=(('external_ids', {'iface-id': 'c8f69aa7-693e-445d-9997-3c34ee42d0ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:66:fa', 'vm-uuid': '6167cc82-55cf-479c-a543-101634481524'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.816 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:06:44 np0005591285 NetworkManager[55017]: <info>  [1769040404.8159] manager: (tapc8f69aa7-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.823 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.824 182759 INFO os_vif [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69')#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.898 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.899 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.899 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:6a:66:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.900 182759 INFO nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Using config drive#033[00m
Jan 21 19:06:44 np0005591285 nova_compute[182755]: 2026-01-22 00:06:44.972 182759 DEBUG nova.network.neutron [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.033 182759 INFO nova.compute.manager [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Took 1.34 seconds to deallocate network for instance.#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.119 182759 DEBUG nova.compute.manager [req-ae23875e-36a1-4aa3-8937-31b3fa0ab450 req-25f7e850-1737-4f93-944c-cf127bc6de34 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Received event network-vif-deleted-4281bc8f-b881-4082-9fc7-f4b6436a837d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.149 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.150 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.231 182759 DEBUG nova.compute.provider_tree [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.250 182759 DEBUG nova.scheduler.client.report [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.288 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.321 182759 INFO nova.scheduler.client.report [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Deleted allocations for instance e5fe3f24-b0cd-4353-af64-6c1c92f1581d#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.358 182759 INFO nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Creating config drive at /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.config#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.365 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpemsu4n79 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.424 182759 DEBUG oslo_concurrency.lockutils [None req-bfdf8379-1463-4e99-a95f-ef09066e740a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "e5fe3f24-b0cd-4353-af64-6c1c92f1581d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.490 182759 DEBUG oslo_concurrency.processutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpemsu4n79" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:06:45 np0005591285 kernel: tapc8f69aa7-69: entered promiscuous mode
Jan 21 19:06:45 np0005591285 systemd-udevd[226294]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:06:45 np0005591285 NetworkManager[55017]: <info>  [1769040405.5406] manager: (tapc8f69aa7-69): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.540 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:45Z|00353|binding|INFO|Claiming lport c8f69aa7-693e-445d-9997-3c34ee42d0ad for this chassis.
Jan 21 19:06:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:45Z|00354|binding|INFO|c8f69aa7-693e-445d-9997-3c34ee42d0ad: Claiming fa:16:3e:6a:66:fa 10.100.0.13
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.550 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:66:fa 10.100.0.13'], port_security=['fa:16:3e:6a:66:fa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6167cc82-55cf-479c-a543-101634481524', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f4224c0-7028-42a4-a552-421afe2237a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f5e11f5-9bfe-4253-bb8d-4e8170927296, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c8f69aa7-693e-445d-9997-3c34ee42d0ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.552 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c8f69aa7-693e-445d-9997-3c34ee42d0ad in datapath f10ec1ab-4b98-425d-b81d-b3bec89eb303 bound to our chassis#033[00m
Jan 21 19:06:45 np0005591285 NetworkManager[55017]: <info>  [1769040405.5547] device (tapc8f69aa7-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.553 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f10ec1ab-4b98-425d-b81d-b3bec89eb303#033[00m
Jan 21 19:06:45 np0005591285 NetworkManager[55017]: <info>  [1769040405.5557] device (tapc8f69aa7-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:06:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:45Z|00355|binding|INFO|Setting lport c8f69aa7-693e-445d-9997-3c34ee42d0ad ovn-installed in OVS
Jan 21 19:06:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:45Z|00356|binding|INFO|Setting lport c8f69aa7-693e-445d-9997-3c34ee42d0ad up in Southbound
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.561 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.564 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d05d127-af44-405f-afb7-75cdb8791b95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.565 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf10ec1ab-41 in ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.566 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf10ec1ab-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.566 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[440f104a-445b-457d-84c4-09a91ae49243]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.567 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e38af950-006e-46e5-962c-05da9b29c009]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.578 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[62c97559-ffb2-4771-8598-49fc96f9eb52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 systemd-machined[154022]: New machine qemu-44-instance-00000062.
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.601 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[db07360b-ad1d-4752-894b-e04929120799]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 systemd[1]: Started Virtual Machine qemu-44-instance-00000062.
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.632 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[145ce416-66b6-4237-8be1-36e96880a6f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 NetworkManager[55017]: <info>  [1769040405.6374] manager: (tapf10ec1ab-40): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.636 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[42e722a6-ebab-466b-91da-f055046a6c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.669 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[297af9a6-8493-4625-849d-fddc82b7c15a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.672 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[17bb6484-4d0f-488b-ad63-514af681a57e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 NetworkManager[55017]: <info>  [1769040405.6915] device (tapf10ec1ab-40): carrier: link connected
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.695 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c99508b5-970b-43f3-813e-beeb20a50bd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.708 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b50df38b-7dfd-4f54-bbaa-72293dd01003]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf10ec1ab-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:d6:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496435, 'reachable_time': 24007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226458, 'error': None, 'target': 'ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.720 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[324401cc-ecf9-4cb4-82b7-fa02c2dbd5b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9d:d6e0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496435, 'tstamp': 496435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226459, 'error': None, 'target': 'ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.736 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9ebe78-5e76-4e9a-a656-06361a44c6c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf10ec1ab-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:d6:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496435, 'reachable_time': 24007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226460, 'error': None, 'target': 'ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.765 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59bd2736-1335-4314-8277-2d84da33ca47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.808 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d32ad144-93db-41d2-b5a6-68f270338d1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.809 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf10ec1ab-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.810 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.810 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf10ec1ab-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 kernel: tapf10ec1ab-40: entered promiscuous mode
Jan 21 19:06:45 np0005591285 NetworkManager[55017]: <info>  [1769040405.8129] manager: (tapf10ec1ab-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.813 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.814 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf10ec1ab-40, col_values=(('external_ids', {'iface-id': 'e13e9986-32c8-46d0-b3e3-65edba110bfc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.815 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:45Z|00357|binding|INFO|Releasing lport e13e9986-32c8-46d0-b3e3-65edba110bfc from this chassis (sb_readonly=0)
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 nova_compute[182755]: 2026-01-22 00:06:45.831 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.832 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f10ec1ab-4b98-425d-b81d-b3bec89eb303.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f10ec1ab-4b98-425d-b81d-b3bec89eb303.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.833 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b3781346-4c48-439f-aee8-5a3922968e18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.834 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-f10ec1ab-4b98-425d-b81d-b3bec89eb303
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/f10ec1ab-4b98-425d-b81d-b3bec89eb303.pid.haproxy
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID f10ec1ab-4b98-425d-b81d-b3bec89eb303
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:06:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:06:45.835 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'env', 'PROCESS_TAG=haproxy-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f10ec1ab-4b98-425d-b81d-b3bec89eb303.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.025 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040406.0243158, 6167cc82-55cf-479c-a543-101634481524 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.026 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] VM Started (Lifecycle Event)#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.056 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.061 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040406.0245938, 6167cc82-55cf-479c-a543-101634481524 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.061 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.095 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.099 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.125 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:06:46 np0005591285 podman[226499]: 2026-01-22 00:06:46.184704184 +0000 UTC m=+0.045787053 container create 270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 19:06:46 np0005591285 systemd[1]: Started libpod-conmon-270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7.scope.
Jan 21 19:06:46 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:06:46 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fcdf618ab44c4445e28131968e548eced512fdaf0da331a39fee4f29efdd80e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:06:46 np0005591285 podman[226499]: 2026-01-22 00:06:46.160831851 +0000 UTC m=+0.021914770 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:06:46 np0005591285 podman[226499]: 2026-01-22 00:06:46.262949388 +0000 UTC m=+0.124032277 container init 270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:06:46 np0005591285 podman[226499]: 2026-01-22 00:06:46.269796172 +0000 UTC m=+0.130879041 container start 270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 19:06:46 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [NOTICE]   (226519) : New worker (226521) forked
Jan 21 19:06:46 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [NOTICE]   (226519) : Loading success.
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.362 182759 DEBUG nova.network.neutron [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updated VIF entry in instance network info cache for port c8f69aa7-693e-445d-9997-3c34ee42d0ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.362 182759 DEBUG nova.network.neutron [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updating instance_info_cache with network_info: [{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.382 182759 DEBUG oslo_concurrency.lockutils [req-724d76ae-2b91-4bf5-8769-3ebdf306009d req-17c26c60-009e-4e51-9e93-46fed9ddaf40 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:06:46 np0005591285 nova_compute[182755]: 2026-01-22 00:06:46.495 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.355 182759 DEBUG nova.compute.manager [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.355 182759 DEBUG oslo_concurrency.lockutils [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.356 182759 DEBUG oslo_concurrency.lockutils [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.356 182759 DEBUG oslo_concurrency.lockutils [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.356 182759 DEBUG nova.compute.manager [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Processing event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.356 182759 DEBUG nova.compute.manager [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.357 182759 DEBUG oslo_concurrency.lockutils [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.357 182759 DEBUG oslo_concurrency.lockutils [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.357 182759 DEBUG oslo_concurrency.lockutils [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.357 182759 DEBUG nova.compute.manager [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] No waiting events found dispatching network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.357 182759 WARNING nova.compute.manager [req-22520865-74c1-4b1d-b36b-1ce697698d1a req-91ec3635-4d16-41c9-901a-3235a84136bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received unexpected event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.358 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.362 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040407.3619666, 6167cc82-55cf-479c-a543-101634481524 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.362 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.364 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.367 182759 INFO nova.virt.libvirt.driver [-] [instance: 6167cc82-55cf-479c-a543-101634481524] Instance spawned successfully.#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.368 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.388 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.394 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.396 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.397 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.397 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.398 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.398 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.399 182759 DEBUG nova.virt.libvirt.driver [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.431 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.481 182759 INFO nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Took 6.60 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.481 182759 DEBUG nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.569 182759 INFO nova.compute.manager [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Took 7.27 seconds to build instance.#033[00m
Jan 21 19:06:47 np0005591285 nova_compute[182755]: 2026-01-22 00:06:47.590 182759 DEBUG oslo_concurrency.lockutils [None req-c5cdc3a8-f3cb-4175-a965-d891041730e5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:49 np0005591285 nova_compute[182755]: 2026-01-22 00:06:49.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:50 np0005591285 podman[226531]: 2026-01-22 00:06:50.188912713 +0000 UTC m=+0.052761660 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:06:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:50Z|00358|binding|INFO|Releasing lport e13e9986-32c8-46d0-b3e3-65edba110bfc from this chassis (sb_readonly=0)
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.023 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:51Z|00359|binding|INFO|Releasing lport e13e9986-32c8-46d0-b3e3-65edba110bfc from this chassis (sb_readonly=0)
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.138 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:51 np0005591285 NetworkManager[55017]: <info>  [1769040411.2990] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.298 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:51 np0005591285 NetworkManager[55017]: <info>  [1769040411.3007] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.396 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:06:51Z|00360|binding|INFO|Releasing lport e13e9986-32c8-46d0-b3e3-65edba110bfc from this chassis (sb_readonly=0)
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.496 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.663 182759 DEBUG nova.compute.manager [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-changed-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.664 182759 DEBUG nova.compute.manager [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Refreshing instance network info cache due to event network-changed-c8f69aa7-693e-445d-9997-3c34ee42d0ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.664 182759 DEBUG oslo_concurrency.lockutils [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.664 182759 DEBUG oslo_concurrency.lockutils [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:06:51 np0005591285 nova_compute[182755]: 2026-01-22 00:06:51.665 182759 DEBUG nova.network.neutron [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Refreshing network info cache for port c8f69aa7-693e-445d-9997-3c34ee42d0ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:06:53 np0005591285 nova_compute[182755]: 2026-01-22 00:06:53.697 182759 DEBUG nova.network.neutron [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updated VIF entry in instance network info cache for port c8f69aa7-693e-445d-9997-3c34ee42d0ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:06:53 np0005591285 nova_compute[182755]: 2026-01-22 00:06:53.699 182759 DEBUG nova.network.neutron [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updating instance_info_cache with network_info: [{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:06:53 np0005591285 nova_compute[182755]: 2026-01-22 00:06:53.718 182759 DEBUG oslo_concurrency.lockutils [req-23af4617-918a-4537-a99a-e068899809c8 req-17ccfa8c-886e-4a9c-ac3f-5e0a6a44ac23 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:06:54 np0005591285 podman[226556]: 2026-01-22 00:06:54.190552704 +0000 UTC m=+0.058696110 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:06:54 np0005591285 podman[226555]: 2026-01-22 00:06:54.234092375 +0000 UTC m=+0.096101665 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 19:06:54 np0005591285 podman[226557]: 2026-01-22 00:06:54.260470195 +0000 UTC m=+0.130111321 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 21 19:06:54 np0005591285 nova_compute[182755]: 2026-01-22 00:06:54.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:56 np0005591285 nova_compute[182755]: 2026-01-22 00:06:56.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.405 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040403.3939867, e5fe3f24-b0cd-4353-af64-6c1c92f1581d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.406 182759 INFO nova.compute.manager [-] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.448 182759 DEBUG nova.compute.manager [None req-0f488408-4e7c-48a6-92b5-42810e7c8edc - - - - - -] [instance: e5fe3f24-b0cd-4353-af64-6c1c92f1581d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.485 182759 DEBUG nova.compute.manager [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.665 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.665 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.695 182759 DEBUG nova.objects.instance [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_requests' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.712 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.713 182759 INFO nova.compute.claims [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.713 182759 DEBUG nova.objects.instance [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.735 182759 DEBUG nova.objects.instance [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.788 182759 INFO nova.compute.resource_tracker [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating resource usage from migration f9552d95-1fd3-4e1c-9c7f-b072e712c5b6#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.788 182759 DEBUG nova.compute.resource_tracker [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Starting to track incoming migration f9552d95-1fd3-4e1c-9c7f-b072e712c5b6 with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.865 182759 DEBUG nova.compute.provider_tree [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.883 182759 DEBUG nova.scheduler.client.report [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.926 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:06:58 np0005591285 nova_compute[182755]: 2026-01-22 00:06:58.927 182759 INFO nova.compute.manager [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Migrating#033[00m
Jan 21 19:06:59 np0005591285 nova_compute[182755]: 2026-01-22 00:06:59.820 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:00 np0005591285 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 19:07:00 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 19:07:00 np0005591285 systemd-logind[788]: New session 42 of user nova.
Jan 21 19:07:00 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 19:07:00 np0005591285 systemd[1]: Starting User Manager for UID 42436...
Jan 21 19:07:01 np0005591285 systemd[226646]: Queued start job for default target Main User Target.
Jan 21 19:07:01 np0005591285 systemd[226646]: Created slice User Application Slice.
Jan 21 19:07:01 np0005591285 systemd[226646]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 19:07:01 np0005591285 systemd[226646]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 19:07:01 np0005591285 systemd[226646]: Reached target Paths.
Jan 21 19:07:01 np0005591285 systemd[226646]: Reached target Timers.
Jan 21 19:07:01 np0005591285 systemd[226646]: Starting D-Bus User Message Bus Socket...
Jan 21 19:07:01 np0005591285 systemd[226646]: Starting Create User's Volatile Files and Directories...
Jan 21 19:07:01 np0005591285 systemd[226646]: Finished Create User's Volatile Files and Directories.
Jan 21 19:07:01 np0005591285 systemd[226646]: Listening on D-Bus User Message Bus Socket.
Jan 21 19:07:01 np0005591285 systemd[226646]: Reached target Sockets.
Jan 21 19:07:01 np0005591285 systemd[226646]: Reached target Basic System.
Jan 21 19:07:01 np0005591285 systemd[226646]: Reached target Main User Target.
Jan 21 19:07:01 np0005591285 systemd[226646]: Startup finished in 169ms.
Jan 21 19:07:01 np0005591285 systemd[1]: Started User Manager for UID 42436.
Jan 21 19:07:01 np0005591285 systemd[1]: Started Session 42 of User nova.
Jan 21 19:07:01 np0005591285 systemd[1]: session-42.scope: Deactivated successfully.
Jan 21 19:07:01 np0005591285 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Jan 21 19:07:01 np0005591285 systemd-logind[788]: Removed session 42.
Jan 21 19:07:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:01Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:66:fa 10.100.0.13
Jan 21 19:07:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:01Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:66:fa 10.100.0.13
Jan 21 19:07:01 np0005591285 systemd-logind[788]: New session 44 of user nova.
Jan 21 19:07:01 np0005591285 systemd[1]: Started Session 44 of User nova.
Jan 21 19:07:01 np0005591285 nova_compute[182755]: 2026-01-22 00:07:01.503 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:01 np0005591285 systemd[1]: session-44.scope: Deactivated successfully.
Jan 21 19:07:01 np0005591285 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Jan 21 19:07:01 np0005591285 systemd-logind[788]: Removed session 44.
Jan 21 19:07:01 np0005591285 systemd-logind[788]: New session 45 of user nova.
Jan 21 19:07:01 np0005591285 systemd[1]: Started Session 45 of User nova.
Jan 21 19:07:02 np0005591285 systemd[1]: session-45.scope: Deactivated successfully.
Jan 21 19:07:02 np0005591285 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Jan 21 19:07:02 np0005591285 systemd-logind[788]: Removed session 45.
Jan 21 19:07:02 np0005591285 systemd-logind[788]: New session 46 of user nova.
Jan 21 19:07:02 np0005591285 systemd[1]: Started Session 46 of User nova.
Jan 21 19:07:02 np0005591285 systemd[1]: session-46.scope: Deactivated successfully.
Jan 21 19:07:02 np0005591285 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Jan 21 19:07:02 np0005591285 systemd-logind[788]: Removed session 46.
Jan 21 19:07:02 np0005591285 systemd-logind[788]: New session 47 of user nova.
Jan 21 19:07:02 np0005591285 systemd[1]: Started Session 47 of User nova.
Jan 21 19:07:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:02.971 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:02.972 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:02.974 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:03 np0005591285 systemd[1]: session-47.scope: Deactivated successfully.
Jan 21 19:07:03 np0005591285 systemd-logind[788]: Session 47 logged out. Waiting for processes to exit.
Jan 21 19:07:03 np0005591285 systemd-logind[788]: Removed session 47.
Jan 21 19:07:04 np0005591285 nova_compute[182755]: 2026-01-22 00:07:04.012 182759 INFO nova.network.neutron [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating port 609c277b-133c-4824-9fd7-17b756932543 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 21 19:07:04 np0005591285 nova_compute[182755]: 2026-01-22 00:07:04.823 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:06 np0005591285 nova_compute[182755]: 2026-01-22 00:07:06.504 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:07 np0005591285 nova_compute[182755]: 2026-01-22 00:07:07.506 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:07 np0005591285 nova_compute[182755]: 2026-01-22 00:07:07.507 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:07 np0005591285 nova_compute[182755]: 2026-01-22 00:07:07.508 182759 DEBUG nova.network.neutron [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:07:07 np0005591285 nova_compute[182755]: 2026-01-22 00:07:07.639 182759 DEBUG nova.compute.manager [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-changed-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:07 np0005591285 nova_compute[182755]: 2026-01-22 00:07:07.640 182759 DEBUG nova.compute.manager [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing instance network info cache due to event network-changed-609c277b-133c-4824-9fd7-17b756932543. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:07:07 np0005591285 nova_compute[182755]: 2026-01-22 00:07:07.640 182759 DEBUG oslo_concurrency.lockutils [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:08 np0005591285 nova_compute[182755]: 2026-01-22 00:07:08.403 182759 INFO nova.compute.manager [None req-99d6612a-8a67-419f-a486-b1f8e85840b1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Get console output#033[00m
Jan 21 19:07:08 np0005591285 nova_compute[182755]: 2026-01-22 00:07:08.410 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:07:09 np0005591285 nova_compute[182755]: 2026-01-22 00:07:09.825 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:09.999 182759 DEBUG nova.network.neutron [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.023 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.031 182759 DEBUG oslo_concurrency.lockutils [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.031 182759 DEBUG nova.network.neutron [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Refreshing network info cache for port 609c277b-133c-4824-9fd7-17b756932543 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.175 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.177 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.178 182759 INFO nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Creating image(s)#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.179 182759 DEBUG nova.objects.instance [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'trusted_certs' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.197 182759 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.277 182759 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.278 182759 DEBUG nova.virt.disk.api [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Checking if we can resize image /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.279 182759 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.340 182759 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.341 182759 DEBUG nova.virt.disk.api [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Cannot resize image /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.364 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.365 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Ensure instance console log exists: /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.366 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.367 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.367 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.372 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start _get_guest_xml network_info=[{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.379 182759 WARNING nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.386 182759 DEBUG nova.virt.libvirt.host [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.387 182759 DEBUG nova.virt.libvirt.host [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.392 182759 DEBUG nova.virt.libvirt.host [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.393 182759 DEBUG nova.virt.libvirt.host [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.396 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.397 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.398 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.399 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.399 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.400 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.401 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.401 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.402 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.402 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.403 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.404 182759 DEBUG nova.virt.hardware [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.404 182759 DEBUG nova.objects.instance [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'vcpu_model' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.436 182759 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.495 182759 DEBUG oslo_concurrency.processutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.496 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.496 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.497 182759 DEBUG oslo_concurrency.lockutils [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.498 182759 DEBUG nova.virt.libvirt.vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.499 182759 DEBUG nova.network.os_vif_util [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.500 182759 DEBUG nova.network.os_vif_util [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.502 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <uuid>a7650c58-4663-47b0-8499-d470f8edddbd</uuid>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <name>instance-0000005c</name>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <memory>196608</memory>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestOtherB-server-1989312991</nova:name>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:07:10</nova:creationTime>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.micro">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:memory>192</nova:memory>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        <nova:port uuid="609c277b-133c-4824-9fd7-17b756932543">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <entry name="serial">a7650c58-4663-47b0-8499-d470f8edddbd</entry>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <entry name="uuid">a7650c58-4663-47b0-8499-d470f8edddbd</entry>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:4e:1a:fc"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <target dev="tap609c277b-13"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/console.log" append="off"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:07:10 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:07:10 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:07:10 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:07:10 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.503 182759 DEBUG nova.virt.libvirt.vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.504 182759 DEBUG nova.network.os_vif_util [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1779791452-network", "vif_mac": "fa:16:3e:4e:1a:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.505 182759 DEBUG nova.network.os_vif_util [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.505 182759 DEBUG os_vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.506 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.507 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.510 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap609c277b-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.510 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap609c277b-13, col_values=(('external_ids', {'iface-id': '609c277b-133c-4824-9fd7-17b756932543', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:1a:fc', 'vm-uuid': 'a7650c58-4663-47b0-8499-d470f8edddbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:10 np0005591285 NetworkManager[55017]: <info>  [1769040430.5128] manager: (tap609c277b-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.512 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.516 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.519 182759 INFO os_vif [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.595 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.595 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.596 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No VIF found with MAC fa:16:3e:4e:1a:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.596 182759 INFO nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Using config drive#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.597 182759 DEBUG nova.compute.manager [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:07:10 np0005591285 nova_compute[182755]: 2026-01-22 00:07:10.598 182759 DEBUG nova.virt.libvirt.driver [None req-61d84e6f-1f5d-4e85-97f8-1a17c46dd747 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 21 19:07:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:11.090 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:07:11 np0005591285 nova_compute[182755]: 2026-01-22 00:07:11.091 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:11.091 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:07:11 np0005591285 nova_compute[182755]: 2026-01-22 00:07:11.573 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:12 np0005591285 nova_compute[182755]: 2026-01-22 00:07:12.005 182759 DEBUG nova.network.neutron [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updated VIF entry in instance network info cache for port 609c277b-133c-4824-9fd7-17b756932543. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:07:12 np0005591285 nova_compute[182755]: 2026-01-22 00:07:12.005 182759 DEBUG nova.network.neutron [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:12 np0005591285 nova_compute[182755]: 2026-01-22 00:07:12.038 182759 DEBUG oslo_concurrency.lockutils [req-7919bbc0-d3f9-4029-8b96-6c174d9052c2 req-505271b8-0003-4264-a046-56ef3977c83b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:13 np0005591285 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 19:07:13 np0005591285 systemd[226646]: Activating special unit Exit the Session...
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped target Main User Target.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped target Basic System.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped target Paths.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped target Sockets.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped target Timers.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 19:07:13 np0005591285 systemd[226646]: Closed D-Bus User Message Bus Socket.
Jan 21 19:07:13 np0005591285 systemd[226646]: Stopped Create User's Volatile Files and Directories.
Jan 21 19:07:13 np0005591285 systemd[226646]: Removed slice User Application Slice.
Jan 21 19:07:13 np0005591285 systemd[226646]: Reached target Shutdown.
Jan 21 19:07:13 np0005591285 systemd[226646]: Finished Exit the Session.
Jan 21 19:07:13 np0005591285 systemd[226646]: Reached target Exit the Session.
Jan 21 19:07:13 np0005591285 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 19:07:13 np0005591285 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 19:07:13 np0005591285 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 19:07:13 np0005591285 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 19:07:13 np0005591285 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 19:07:13 np0005591285 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 19:07:13 np0005591285 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 19:07:14 np0005591285 nova_compute[182755]: 2026-01-22 00:07:14.202 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:14 np0005591285 podman[226698]: 2026-01-22 00:07:14.229255309 +0000 UTC m=+0.092719025 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 21 19:07:14 np0005591285 podman[226699]: 2026-01-22 00:07:14.230970045 +0000 UTC m=+0.092525549 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 21 19:07:15 np0005591285 nova_compute[182755]: 2026-01-22 00:07:15.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:16 np0005591285 nova_compute[182755]: 2026-01-22 00:07:16.575 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:19 np0005591285 nova_compute[182755]: 2026-01-22 00:07:19.446 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'flavor' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:19 np0005591285 nova_compute[182755]: 2026-01-22 00:07:19.479 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'info_cache' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:19 np0005591285 nova_compute[182755]: 2026-01-22 00:07:19.512 182759 DEBUG oslo_concurrency.lockutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:19 np0005591285 nova_compute[182755]: 2026-01-22 00:07:19.513 182759 DEBUG oslo_concurrency.lockutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:19 np0005591285 nova_compute[182755]: 2026-01-22 00:07:19.514 182759 DEBUG nova.network.neutron [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:07:20 np0005591285 nova_compute[182755]: 2026-01-22 00:07:20.181 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:20 np0005591285 nova_compute[182755]: 2026-01-22 00:07:20.515 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.094 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:21 np0005591285 podman[226739]: 2026-01-22 00:07:21.163632529 +0000 UTC m=+0.043917272 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.372 182759 DEBUG nova.network.neutron [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.397 182759 DEBUG oslo_concurrency.lockutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a7650c58-4663-47b0-8499-d470f8edddbd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.433 182759 INFO nova.virt.libvirt.driver [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance destroyed successfully.#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.433 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'numa_topology' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.451 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.463 182759 DEBUG nova.virt.libvirt.vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.464 182759 DEBUG nova.network.os_vif_util [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.465 182759 DEBUG nova.network.os_vif_util [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.466 182759 DEBUG os_vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.469 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.469 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap609c277b-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.471 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.472 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.475 182759 INFO os_vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.482 182759 DEBUG nova.virt.libvirt.driver [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start _get_guest_xml network_info=[{"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.485 182759 WARNING nova.virt.libvirt.driver [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.490 182759 DEBUG nova.virt.libvirt.host [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.490 182759 DEBUG nova.virt.libvirt.host [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.494 182759 DEBUG nova.virt.libvirt.host [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.494 182759 DEBUG nova.virt.libvirt.host [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.495 182759 DEBUG nova.virt.libvirt.driver [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.496 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.496 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.496 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.496 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.497 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.497 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.497 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.497 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.498 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.498 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.498 182759 DEBUG nova.virt.hardware [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.498 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'vcpu_model' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.518 182759 DEBUG nova.virt.libvirt.vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.518 182759 DEBUG nova.network.os_vif_util [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.519 182759 DEBUG nova.network.os_vif_util [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.519 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.535 182759 DEBUG nova.virt.libvirt.driver [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <uuid>a7650c58-4663-47b0-8499-d470f8edddbd</uuid>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <name>instance-0000005c</name>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <memory>196608</memory>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestOtherB-server-1989312991</nova:name>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:07:21</nova:creationTime>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.micro">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:memory>192</nova:memory>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        <nova:port uuid="609c277b-133c-4824-9fd7-17b756932543">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <entry name="serial">a7650c58-4663-47b0-8499-d470f8edddbd</entry>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <entry name="uuid">a7650c58-4663-47b0-8499-d470f8edddbd</entry>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk.config"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:4e:1a:fc"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <target dev="tap609c277b-13"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/console.log" append="off"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:07:21 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:07:21 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:07:21 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:07:21 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.537 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.596 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.597 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.654 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.655 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'trusted_certs' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.702 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.756 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.757 182759 DEBUG nova.virt.disk.api [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Checking if we can resize image /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.758 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.819 182759 DEBUG oslo_concurrency.processutils [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.820 182759 DEBUG nova.virt.disk.api [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Cannot resize image /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.821 182759 DEBUG nova.objects.instance [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.834 182759 DEBUG nova.virt.libvirt.vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.835 182759 DEBUG nova.network.os_vif_util [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.836 182759 DEBUG nova.network.os_vif_util [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.836 182759 DEBUG os_vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.837 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.837 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.838 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.840 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.841 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap609c277b-13, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.841 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap609c277b-13, col_values=(('external_ids', {'iface-id': '609c277b-133c-4824-9fd7-17b756932543', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:1a:fc', 'vm-uuid': 'a7650c58-4663-47b0-8499-d470f8edddbd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:21 np0005591285 NetworkManager[55017]: <info>  [1769040441.8936] manager: (tap609c277b-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.893 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.896 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.900 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.900 182759 INFO os_vif [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')#033[00m
Jan 21 19:07:21 np0005591285 kernel: tap609c277b-13: entered promiscuous mode
Jan 21 19:07:21 np0005591285 NetworkManager[55017]: <info>  [1769040441.9688] manager: (tap609c277b-13): new Tun device (/org/freedesktop/NetworkManager/Devices/182)
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.970 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:21Z|00361|binding|INFO|Claiming lport 609c277b-133c-4824-9fd7-17b756932543 for this chassis.
Jan 21 19:07:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:21Z|00362|binding|INFO|609c277b-133c-4824-9fd7-17b756932543: Claiming fa:16:3e:4e:1a:fc 10.100.0.7
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.979 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:1a:fc 10.100.0.7'], port_security=['fa:16:3e:4e:1a:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=609c277b-133c-4824-9fd7-17b756932543) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.980 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 609c277b-133c-4824-9fd7-17b756932543 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb bound to our chassis#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.981 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb#033[00m
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.984 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:21Z|00363|binding|INFO|Setting lport 609c277b-133c-4824-9fd7-17b756932543 ovn-installed in OVS
Jan 21 19:07:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:21Z|00364|binding|INFO|Setting lport 609c277b-133c-4824-9fd7-17b756932543 up in Southbound
Jan 21 19:07:21 np0005591285 nova_compute[182755]: 2026-01-22 00:07:21.987 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.992 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[92ad4913-a137-4f7f-96dd-601ff9c1b2a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.992 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a4bd631-61 in ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.995 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a4bd631-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.995 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6328e270-6257-4da5-8675-72b30cc55bd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:21.996 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa74203-9638-4de2-8209-a4e3dd17cb06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 systemd-udevd[226793]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.007 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa18ab0-09b8-45d8-bd86-9dd7153f9a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 systemd-machined[154022]: New machine qemu-45-instance-0000005c.
Jan 21 19:07:22 np0005591285 NetworkManager[55017]: <info>  [1769040442.0173] device (tap609c277b-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:07:22 np0005591285 NetworkManager[55017]: <info>  [1769040442.0178] device (tap609c277b-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.020 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e02c7185-a1b8-47ca-9642-5dcd3f3b9152]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 systemd[1]: Started Virtual Machine qemu-45-instance-0000005c.
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.050 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[961cbe0b-90d6-4d17-b04a-77124abe3904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.056 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fbbc0eb7-dcbe-4b75-98bd-d9befc32a089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 systemd-udevd[226796]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:07:22 np0005591285 NetworkManager[55017]: <info>  [1769040442.0574] manager: (tap1a4bd631-60): new Veth device (/org/freedesktop/NetworkManager/Devices/183)
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.084 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cf32f47d-33c8-428f-bc7a-75bae8fabd09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.086 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0e011059-e4ff-496f-8e31-d6c65dd66200]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 NetworkManager[55017]: <info>  [1769040442.1071] device (tap1a4bd631-60): carrier: link connected
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.114 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0df420ff-dc41-4666-8793-4ac50da30fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.133 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[31f61a64-ed41-4eff-a0e0-da06e8433edf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500076, 'reachable_time': 42507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226824, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.148 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ab972ebb-1ecf-45ff-bc7d-95f43fc3672e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:7833'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500076, 'tstamp': 500076}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226825, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.162 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3485577f-8d32-4df4-82b1-d288473eff7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500076, 'reachable_time': 42507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226826, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.186 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e57aec6-d056-4d9c-a3e9-33b6c4c707a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.201 182759 DEBUG nova.compute.manager [req-c07e4f47-fb2e-4383-8b78-7883c4890df4 req-2a7fcf83-a0f7-4f3f-a56a-7d583d786f61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.202 182759 DEBUG oslo_concurrency.lockutils [req-c07e4f47-fb2e-4383-8b78-7883c4890df4 req-2a7fcf83-a0f7-4f3f-a56a-7d583d786f61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.202 182759 DEBUG oslo_concurrency.lockutils [req-c07e4f47-fb2e-4383-8b78-7883c4890df4 req-2a7fcf83-a0f7-4f3f-a56a-7d583d786f61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.202 182759 DEBUG oslo_concurrency.lockutils [req-c07e4f47-fb2e-4383-8b78-7883c4890df4 req-2a7fcf83-a0f7-4f3f-a56a-7d583d786f61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.202 182759 DEBUG nova.compute.manager [req-c07e4f47-fb2e-4383-8b78-7883c4890df4 req-2a7fcf83-a0f7-4f3f-a56a-7d583d786f61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.203 182759 WARNING nova.compute.manager [req-c07e4f47-fb2e-4383-8b78-7883c4890df4 req-2a7fcf83-a0f7-4f3f-a56a-7d583d786f61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received unexpected event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.237 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b23fcd57-b156-4a73-a6d5-c4705790911c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.238 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.238 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.238 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.240 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:22 np0005591285 kernel: tap1a4bd631-60: entered promiscuous mode
Jan 21 19:07:22 np0005591285 NetworkManager[55017]: <info>  [1769040442.2423] manager: (tap1a4bd631-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.246 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:22Z|00365|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.247 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.249 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.249 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[645ccfe2-00b2-44b2-a171-9ef1b82073b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.250 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:07:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:22.251 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'env', 'PROCESS_TAG=haproxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a4bd631-64c5-4e00-9341-0e44fd0833fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.260 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:22 np0005591285 podman[226858]: 2026-01-22 00:07:22.569623939 +0000 UTC m=+0.041312182 container create 86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 19:07:22 np0005591285 systemd[1]: Started libpod-conmon-86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5.scope.
Jan 21 19:07:22 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:07:22 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3f83024869f6be105fd0cd3704bdead44c85aed310f10e5ae1503bc07ffc5db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:07:22 np0005591285 podman[226858]: 2026-01-22 00:07:22.633784025 +0000 UTC m=+0.105472288 container init 86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:07:22 np0005591285 podman[226858]: 2026-01-22 00:07:22.638862041 +0000 UTC m=+0.110550274 container start 86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 19:07:22 np0005591285 podman[226858]: 2026-01-22 00:07:22.546381973 +0000 UTC m=+0.018070236 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:07:22 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [NOTICE]   (226884) : New worker (226887) forked
Jan 21 19:07:22 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [NOTICE]   (226884) : Loading success.
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.695 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040442.6953099, a7650c58-4663-47b0-8499-d470f8edddbd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.697 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.699 182759 DEBUG nova.compute.manager [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.703 182759 INFO nova.virt.libvirt.driver [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance rebooted successfully.#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.703 182759 DEBUG nova.compute.manager [None req-7c8e4fe1-5f6b-4bca-b226-67aa85b58c75 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.724 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.728 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.761 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.762 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040442.6971788, a7650c58-4663-47b0-8499-d470f8edddbd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.762 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Started (Lifecycle Event)#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.789 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:22 np0005591285 nova_compute[182755]: 2026-01-22 00:07:22.792 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.686 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.687 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.687 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.687 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.688 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.703 182759 INFO nova.compute.manager [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Terminating instance#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.714 182759 DEBUG nova.compute.manager [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:07:23 np0005591285 kernel: tap609c277b-13 (unregistering): left promiscuous mode
Jan 21 19:07:23 np0005591285 NetworkManager[55017]: <info>  [1769040443.7336] device (tap609c277b-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:23Z|00366|binding|INFO|Releasing lport 609c277b-133c-4824-9fd7-17b756932543 from this chassis (sb_readonly=0)
Jan 21 19:07:23 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:23Z|00367|binding|INFO|Setting lport 609c277b-133c-4824-9fd7-17b756932543 down in Southbound
Jan 21 19:07:23 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:23Z|00368|binding|INFO|Removing iface tap609c277b-13 ovn-installed in OVS
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.739 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.750 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:1a:fc 10.100.0.7'], port_security=['fa:16:3e:4e:1a:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7650c58-4663-47b0-8499-d470f8edddbd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=609c277b-133c-4824-9fd7-17b756932543) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.751 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 609c277b-133c-4824-9fd7-17b756932543 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.752 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.753 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce27565-cd02-430b-8b04-c59f1580163f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.754 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace which is not needed anymore#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.755 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Jan 21 19:07:23 np0005591285 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005c.scope: Consumed 1.725s CPU time.
Jan 21 19:07:23 np0005591285 systemd-machined[154022]: Machine qemu-45-instance-0000005c terminated.
Jan 21 19:07:23 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [NOTICE]   (226884) : haproxy version is 2.8.14-c23fe91
Jan 21 19:07:23 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [NOTICE]   (226884) : path to executable is /usr/sbin/haproxy
Jan 21 19:07:23 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [WARNING]  (226884) : Exiting Master process...
Jan 21 19:07:23 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [WARNING]  (226884) : Exiting Master process...
Jan 21 19:07:23 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [ALERT]    (226884) : Current worker (226887) exited with code 143 (Terminated)
Jan 21 19:07:23 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[226879]: [WARNING]  (226884) : All workers exited. Exiting... (0)
Jan 21 19:07:23 np0005591285 systemd[1]: libpod-86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5.scope: Deactivated successfully.
Jan 21 19:07:23 np0005591285 podman[226919]: 2026-01-22 00:07:23.871511348 +0000 UTC m=+0.039871833 container died 86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:07:23 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5-userdata-shm.mount: Deactivated successfully.
Jan 21 19:07:23 np0005591285 systemd[1]: var-lib-containers-storage-overlay-f3f83024869f6be105fd0cd3704bdead44c85aed310f10e5ae1503bc07ffc5db-merged.mount: Deactivated successfully.
Jan 21 19:07:23 np0005591285 podman[226919]: 2026-01-22 00:07:23.899691957 +0000 UTC m=+0.068052452 container cleanup 86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 19:07:23 np0005591285 systemd[1]: libpod-conmon-86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5.scope: Deactivated successfully.
Jan 21 19:07:23 np0005591285 kernel: tap609c277b-13: entered promiscuous mode
Jan 21 19:07:23 np0005591285 kernel: tap609c277b-13 (unregistering): left promiscuous mode
Jan 21 19:07:23 np0005591285 NetworkManager[55017]: <info>  [1769040443.9368] manager: (tap609c277b-13): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.942 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 podman[226949]: 2026-01-22 00:07:23.958046606 +0000 UTC m=+0.039936695 container remove 86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.962 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2022e73d-1064-4c45-bbc6-565562f573a8]: (4, ('Thu Jan 22 12:07:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5)\n86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5\nThu Jan 22 12:07:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5)\n86cd9faf4032166c4405b8344aa6137f391fb62315c50a06067bc2a4210c59b5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.964 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fabb8af2-b007-4c38-a790-62ed056329aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.965 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.966 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 kernel: tap1a4bd631-60: left promiscuous mode
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.977 182759 INFO nova.virt.libvirt.driver [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Instance destroyed successfully.#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.977 182759 DEBUG nova.objects.instance [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid a7650c58-4663-47b0-8499-d470f8edddbd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.981 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.983 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[38fdcfd0-2896-4c88-9722-b573fb8a44ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.993 182759 DEBUG nova.virt.libvirt.vif [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1989312991',display_name='tempest-ServerActionsTestOtherB-server-1989312991',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1989312991',id=92,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-sfv20ppo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a7650c58-4663-47b0-8499-d470f8edddbd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.994 182759 DEBUG nova.network.os_vif_util [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "609c277b-133c-4824-9fd7-17b756932543", "address": "fa:16:3e:4e:1a:fc", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609c277b-13", "ovs_interfaceid": "609c277b-133c-4824-9fd7-17b756932543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.994 182759 DEBUG nova.network.os_vif_util [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.995 182759 DEBUG os_vif [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.995 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2bd704-2ba0-4262-818c-df484ec2eb8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.996 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:23 np0005591285 nova_compute[182755]: 2026-01-22 00:07:23.996 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap609c277b-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:23.996 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4dae820-c0e8-4dfe-b6d6-e091c7d21d65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:24.011 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7c5ebad5-7a35-4c99-a67f-a8774ebc55ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500070, 'reachable_time': 22248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226982, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:24 np0005591285 systemd[1]: run-netns-ovnmeta\x2d1a4bd631\x2d64c5\x2d4e00\x2d9341\x2d0e44fd0833fb.mount: Deactivated successfully.
Jan 21 19:07:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:24.028 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:07:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:24.028 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6fed026d-00a4-46f4-b9ee-fbeedf72fdf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.029 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.030 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.031 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.033 182759 INFO os_vif [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:1a:fc,bridge_name='br-int',has_traffic_filtering=True,id=609c277b-133c-4824-9fd7-17b756932543,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap609c277b-13')#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.034 182759 INFO nova.virt.libvirt.driver [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Deleting instance files /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_del#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.039 182759 INFO nova.virt.libvirt.driver [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Deletion of /var/lib/nova/instances/a7650c58-4663-47b0-8499-d470f8edddbd_del complete#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.166 182759 INFO nova.compute.manager [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.167 182759 DEBUG oslo.service.loopingcall [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.167 182759 DEBUG nova.compute.manager [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.167 182759 DEBUG nova.network.neutron [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.956 182759 DEBUG nova.compute.manager [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.957 182759 DEBUG oslo_concurrency.lockutils [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.958 182759 DEBUG oslo_concurrency.lockutils [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.959 182759 DEBUG oslo_concurrency.lockutils [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.959 182759 DEBUG nova.compute.manager [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.959 182759 WARNING nova.compute.manager [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received unexpected event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.960 182759 DEBUG nova.compute.manager [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-unplugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.960 182759 DEBUG oslo_concurrency.lockutils [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.961 182759 DEBUG oslo_concurrency.lockutils [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.961 182759 DEBUG oslo_concurrency.lockutils [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.961 182759 DEBUG nova.compute.manager [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-unplugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:24 np0005591285 nova_compute[182755]: 2026-01-22 00:07:24.962 182759 DEBUG nova.compute.manager [req-1c7f4597-fffa-47d7-a8ec-a7245fdcbd0c req-c830c431-5316-4f25-9229-2365675edd0a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-unplugged-609c277b-133c-4824-9fd7-17b756932543 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.085 182759 DEBUG nova.network.neutron [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.106 182759 INFO nova.compute.manager [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Took 0.94 seconds to deallocate network for instance.#033[00m
Jan 21 19:07:25 np0005591285 podman[226984]: 2026-01-22 00:07:25.193498399 +0000 UTC m=+0.057523298 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:07:25 np0005591285 podman[226983]: 2026-01-22 00:07:25.193642183 +0000 UTC m=+0.058631668 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.208 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.208 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.215 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:25 np0005591285 podman[226985]: 2026-01-22 00:07:25.263313537 +0000 UTC m=+0.122488555 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.311 182759 INFO nova.scheduler.client.report [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Deleted allocations for instance a7650c58-4663-47b0-8499-d470f8edddbd#033[00m
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.406 182759 DEBUG oslo_concurrency.lockutils [None req-22201990-8d45-4b4c-8af3-b670b4f55877 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:25 np0005591285 nova_compute[182755]: 2026-01-22 00:07:25.731 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:26 np0005591285 nova_compute[182755]: 2026-01-22 00:07:26.579 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.099 182759 DEBUG nova.compute.manager [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.100 182759 DEBUG oslo_concurrency.lockutils [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.100 182759 DEBUG oslo_concurrency.lockutils [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.100 182759 DEBUG oslo_concurrency.lockutils [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a7650c58-4663-47b0-8499-d470f8edddbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.100 182759 DEBUG nova.compute.manager [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] No waiting events found dispatching network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.100 182759 WARNING nova.compute.manager [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received unexpected event network-vif-plugged-609c277b-133c-4824-9fd7-17b756932543 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:07:27 np0005591285 nova_compute[182755]: 2026-01-22 00:07:27.101 182759 DEBUG nova.compute.manager [req-fea6ecf9-78da-4a17-9f0c-e9e22e9cc4eb req-c151d313-8736-4317-990f-4a9881179a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Received event network-vif-deleted-609c277b-133c-4824-9fd7-17b756932543 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:29 np0005591285 nova_compute[182755]: 2026-01-22 00:07:29.030 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.143 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.144 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.163 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.235 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.273 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.274 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.281 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.281 182759 INFO nova.compute.claims [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.424 182759 DEBUG nova.compute.provider_tree [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.488 182759 DEBUG nova.scheduler.client.report [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.520 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.521 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.594 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.595 182759 DEBUG nova.network.neutron [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.620 182759 INFO nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.644 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.808 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.810 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.810 182759 INFO nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Creating image(s)#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.811 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.811 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.812 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.830 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.891 182759 DEBUG nova.policy [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.896 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.896 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.897 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.914 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.968 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:30 np0005591285 nova_compute[182755]: 2026-01-22 00:07:30.969 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.003 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.005 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.005 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.065 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.067 182759 DEBUG nova.virt.disk.api [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Checking if we can resize image /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.067 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.125 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.126 182759 DEBUG nova.virt.disk.api [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Cannot resize image /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.126 182759 DEBUG nova.objects.instance [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.176 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.176 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Ensure instance console log exists: /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.177 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.177 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.177 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:31 np0005591285 nova_compute[182755]: 2026-01-22 00:07:31.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:32 np0005591285 nova_compute[182755]: 2026-01-22 00:07:32.210 182759 DEBUG nova.network.neutron [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Successfully created port: 02f1d29d-b6df-46d8-8387-cfa84ffb24af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:07:33 np0005591285 nova_compute[182755]: 2026-01-22 00:07:33.230 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:33 np0005591285 nova_compute[182755]: 2026-01-22 00:07:33.423 182759 DEBUG nova.network.neutron [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Successfully updated port: 02f1d29d-b6df-46d8-8387-cfa84ffb24af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:07:33 np0005591285 nova_compute[182755]: 2026-01-22 00:07:33.456 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:33 np0005591285 nova_compute[182755]: 2026-01-22 00:07:33.457 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:33 np0005591285 nova_compute[182755]: 2026-01-22 00:07:33.457 182759 DEBUG nova.network.neutron [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:07:33 np0005591285 nova_compute[182755]: 2026-01-22 00:07:33.723 182759 DEBUG nova.network.neutron [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.032 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.692 182759 DEBUG nova.network.neutron [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.742 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.742 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance network_info: |[{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.746 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Start _get_guest_xml network_info=[{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.753 182759 WARNING nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.759 182759 DEBUG nova.virt.libvirt.host [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.759 182759 DEBUG nova.virt.libvirt.host [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.763 182759 DEBUG nova.virt.libvirt.host [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.763 182759 DEBUG nova.virt.libvirt.host [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.765 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.765 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.766 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.766 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.766 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.767 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.767 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.767 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.768 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.768 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.769 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.769 182759 DEBUG nova.virt.hardware [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.774 182759 DEBUG nova.virt.libvirt.vif [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-261934281',display_name='tempest-ServerActionsTestOtherB-server-261934281',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-261934281',id=104,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-bqv3jv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=4e87b9c8-cfba-431e-966e-24799ad0ece2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.775 182759 DEBUG nova.network.os_vif_util [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.775 182759 DEBUG nova.network.os_vif_util [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.777 182759 DEBUG nova.objects.instance [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.794 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <uuid>4e87b9c8-cfba-431e-966e-24799ad0ece2</uuid>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <name>instance-00000068</name>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestOtherB-server-261934281</nova:name>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:07:34</nova:creationTime>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        <nova:port uuid="02f1d29d-b6df-46d8-8387-cfa84ffb24af">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <entry name="serial">4e87b9c8-cfba-431e-966e-24799ad0ece2</entry>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <entry name="uuid">4e87b9c8-cfba-431e-966e-24799ad0ece2</entry>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:7c:e7:2e"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <target dev="tap02f1d29d-b6"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/console.log" append="off"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:07:34 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:07:34 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:07:34 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:07:34 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.796 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Preparing to wait for external event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.797 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.797 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.798 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.799 182759 DEBUG nova.virt.libvirt.vif [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-261934281',display_name='tempest-ServerActionsTestOtherB-server-261934281',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-261934281',id=104,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-bqv3jv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=4e87b9c8-cfba-431e-966e-24799ad0ece2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.800 182759 DEBUG nova.network.os_vif_util [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.801 182759 DEBUG nova.network.os_vif_util [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.802 182759 DEBUG os_vif [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.803 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.803 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.804 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.808 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.809 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02f1d29d-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.810 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02f1d29d-b6, col_values=(('external_ids', {'iface-id': '02f1d29d-b6df-46d8-8387-cfa84ffb24af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:e7:2e', 'vm-uuid': '4e87b9c8-cfba-431e-966e-24799ad0ece2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:34 np0005591285 NetworkManager[55017]: <info>  [1769040454.8132] manager: (tap02f1d29d-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.821 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.822 182759 INFO os_vif [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6')#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.901 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.902 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.902 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No VIF found with MAC fa:16:3e:7c:e7:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:07:34 np0005591285 nova_compute[182755]: 2026-01-22 00:07:34.903 182759 INFO nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Using config drive#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.377 182759 DEBUG nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.378 182759 DEBUG nova.compute.manager [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing instance network info cache due to event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.378 182759 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.379 182759 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.379 182759 DEBUG nova.network.neutron [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.909 182759 INFO nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Creating config drive at /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config#033[00m
Jan 21 19:07:35 np0005591285 nova_compute[182755]: 2026-01-22 00:07:35.921 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3riflah2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.072 182759 DEBUG oslo_concurrency.processutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3riflah2" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:36 np0005591285 kernel: tap02f1d29d-b6: entered promiscuous mode
Jan 21 19:07:36 np0005591285 NetworkManager[55017]: <info>  [1769040456.1420] manager: (tap02f1d29d-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.143 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:36Z|00369|binding|INFO|Claiming lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af for this chassis.
Jan 21 19:07:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:36Z|00370|binding|INFO|02f1d29d-b6df-46d8-8387-cfa84ffb24af: Claiming fa:16:3e:7c:e7:2e 10.100.0.8
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.146 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.154 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:e7:2e 10.100.0.8'], port_security=['fa:16:3e:7c:e7:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=02f1d29d-b6df-46d8-8387-cfa84ffb24af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.155 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 02f1d29d-b6df-46d8-8387-cfa84ffb24af in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb bound to our chassis#033[00m
Jan 21 19:07:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:36Z|00371|binding|INFO|Setting lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af ovn-installed in OVS
Jan 21 19:07:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:36Z|00372|binding|INFO|Setting lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af up in Southbound
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.157 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.158 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.159 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.175 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c44c4bf8-3d20-4c20-813d-7ee470a2c936]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.176 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a4bd631-61 in ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.179 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a4bd631-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.179 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f1f499-f741-42d9-af84-b0e0fc07e128]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.181 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd28812-810d-485d-ad24-2e27d0367efc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 systemd-machined[154022]: New machine qemu-46-instance-00000068.
Jan 21 19:07:36 np0005591285 systemd-udevd[227088]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.195 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[76880063-762c-4015-a142-a3257b3504b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 NetworkManager[55017]: <info>  [1769040456.2019] device (tap02f1d29d-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:07:36 np0005591285 NetworkManager[55017]: <info>  [1769040456.2024] device (tap02f1d29d-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:07:36 np0005591285 systemd[1]: Started Virtual Machine qemu-46-instance-00000068.
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.206 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[88e944f6-c933-47bd-9194-ef69f48a9e58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.233 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[364ab602-2410-4ea3-a30f-d5eb76cfabc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.238 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a433797-f0f3-4675-ba55-1eb7be6a729b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 NetworkManager[55017]: <info>  [1769040456.2412] manager: (tap1a4bd631-60): new Veth device (/org/freedesktop/NetworkManager/Devices/188)
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.276 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[47ea4406-1e57-4daa-9e41-c73044f32560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.278 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[94d6eb16-de2b-4739-9f07-10aa15be94fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 NetworkManager[55017]: <info>  [1769040456.3010] device (tap1a4bd631-60): carrier: link connected
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.306 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2d82e545-ad2f-4f45-8218-cd3847227115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.331 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ccccfc75-fdfc-4625-95d7-ee859bada3c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501495, 'reachable_time': 33859, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227120, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.353 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[147d082b-f8b7-4433-ba35-96bb30dfdf93]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:7833'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501495, 'tstamp': 501495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227121, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.373 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcf95a9-0512-4def-9596-85fcb2f519ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501495, 'reachable_time': 33859, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227122, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.410 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[88ea30a5-bf5c-4bcd-abae-0b48f7aa0b70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.480 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b1ee41-81bf-40fe-9834-69adc5c85a06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.481 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.482 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.482 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.484 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 kernel: tap1a4bd631-60: entered promiscuous mode
Jan 21 19:07:36 np0005591285 NetworkManager[55017]: <info>  [1769040456.4860] manager: (tap1a4bd631-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.494 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:36Z|00373|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.496 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.499 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.508 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[878b68e9-0499-4295-bf35-165a3786e4f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.509 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:07:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:36.510 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'env', 'PROCESS_TAG=haproxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a4bd631-64c5-4e00-9341-0e44fd0833fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.584 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.673 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:36 np0005591285 podman[227154]: 2026-01-22 00:07:36.894820356 +0000 UTC m=+0.055919225 container create b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:07:36 np0005591285 systemd[1]: Started libpod-conmon-b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf.scope.
Jan 21 19:07:36 np0005591285 podman[227154]: 2026-01-22 00:07:36.858840738 +0000 UTC m=+0.019939617 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:07:36 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:07:36 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f38838ec0e67ba5930ac143552c01becc60215c2657e40e88a8d82cb4196e40/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.990 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040456.9894776, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:07:36 np0005591285 nova_compute[182755]: 2026-01-22 00:07:36.991 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Started (Lifecycle Event)#033[00m
Jan 21 19:07:37 np0005591285 podman[227154]: 2026-01-22 00:07:37.003416058 +0000 UTC m=+0.164514957 container init b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:07:37 np0005591285 podman[227154]: 2026-01-22 00:07:37.010974781 +0000 UTC m=+0.172073670 container start b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.018 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.023 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040456.9896126, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.024 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:07:37 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [NOTICE]   (227181) : New worker (227183) forked
Jan 21 19:07:37 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [NOTICE]   (227181) : Loading success.
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.065 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.069 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.094 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.663 182759 DEBUG nova.compute.manager [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.664 182759 DEBUG oslo_concurrency.lockutils [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.664 182759 DEBUG oslo_concurrency.lockutils [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.665 182759 DEBUG oslo_concurrency.lockutils [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.665 182759 DEBUG nova.compute.manager [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Processing event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.666 182759 DEBUG nova.compute.manager [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.666 182759 DEBUG oslo_concurrency.lockutils [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.666 182759 DEBUG oslo_concurrency.lockutils [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.667 182759 DEBUG oslo_concurrency.lockutils [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.667 182759 DEBUG nova.compute.manager [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] No waiting events found dispatching network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.667 182759 WARNING nova.compute.manager [req-ead09f89-60e3-4374-9bf6-fd5a7ca018fd req-13c8c99b-f4fb-467c-834e-8823f2d2f5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received unexpected event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.668 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.673 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040457.6736524, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.674 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.676 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.679 182759 INFO nova.virt.libvirt.driver [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance spawned successfully.#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.680 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.706 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.713 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.717 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.717 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.717 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.718 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.718 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.719 182759 DEBUG nova.virt.libvirt.driver [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.756 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.811 182759 INFO nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Took 7.00 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.812 182759 DEBUG nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.814 182759 DEBUG nova.network.neutron [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updated VIF entry in instance network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.814 182759 DEBUG nova.network.neutron [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.842 182759 DEBUG oslo_concurrency.lockutils [req-79cc67d8-8d39-4984-a3c1-d17fc192a0e7 req-c7d82ee3-6dea-4c44-8427-588876f0cc32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.937 182759 INFO nova.compute.manager [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Took 7.71 seconds to build instance.#033[00m
Jan 21 19:07:37 np0005591285 nova_compute[182755]: 2026-01-22 00:07:37.963 182759 DEBUG oslo_concurrency.lockutils [None req-a8e93372-239c-4045-a799-78d31a5cb7ba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.819s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:38 np0005591285 nova_compute[182755]: 2026-01-22 00:07:38.975 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040443.9740355, a7650c58-4663-47b0-8499-d470f8edddbd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:07:38 np0005591285 nova_compute[182755]: 2026-01-22 00:07:38.975 182759 INFO nova.compute.manager [-] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:07:39 np0005591285 nova_compute[182755]: 2026-01-22 00:07:39.001 182759 DEBUG nova.compute.manager [None req-62564d7f-fbee-45b2-80d3-b970358a0284 - - - - - -] [instance: a7650c58-4663-47b0-8499-d470f8edddbd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:07:39 np0005591285 nova_compute[182755]: 2026-01-22 00:07:39.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:39 np0005591285 nova_compute[182755]: 2026-01-22 00:07:39.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:41 np0005591285 nova_compute[182755]: 2026-01-22 00:07:41.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:41 np0005591285 nova_compute[182755]: 2026-01-22 00:07:41.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:41 np0005591285 nova_compute[182755]: 2026-01-22 00:07:41.589 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.100 182759 DEBUG nova.compute.manager [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.100 182759 DEBUG nova.compute.manager [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing instance network info cache due to event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.101 182759 DEBUG oslo_concurrency.lockutils [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.101 182759 DEBUG oslo_concurrency.lockutils [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.101 182759 DEBUG nova.network.neutron [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:43 np0005591285 nova_compute[182755]: 2026-01-22 00:07:43.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.233 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.260 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.260 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.285 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.286 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.286 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.287 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:07:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:44Z|00374|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:07:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:44Z|00375|binding|INFO|Releasing lport e13e9986-32c8-46d0-b3e3-65edba110bfc from this chassis (sb_readonly=0)
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.357 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.415 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:44 np0005591285 podman[227194]: 2026-01-22 00:07:44.442773501 +0000 UTC m=+0.081150314 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.447 182759 DEBUG nova.network.neutron [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updated VIF entry in instance network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.448 182759 DEBUG nova.network.neutron [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:44 np0005591285 podman[227192]: 2026-01-22 00:07:44.456680215 +0000 UTC m=+0.087262138 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal)
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.469 182759 DEBUG oslo_concurrency.lockutils [req-1f12994a-eada-4d09-a418-ffe76e32122a req-75e7e77c-02fc-413a-8d28-46d8f1352735 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.490 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.490 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.557 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.563 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.650 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.651 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.704 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.815 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.875 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.876 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5342MB free_disk=73.23148345947266GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.877 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:44 np0005591285 nova_compute[182755]: 2026-01-22 00:07:44.877 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.124 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 6167cc82-55cf-479c-a543-101634481524 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.125 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 4e87b9c8-cfba-431e-966e-24799ad0ece2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.125 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.126 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.186 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.204 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.234 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:07:45 np0005591285 nova_compute[182755]: 2026-01-22 00:07:45.234 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:46 np0005591285 nova_compute[182755]: 2026-01-22 00:07:46.192 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:46 np0005591285 nova_compute[182755]: 2026-01-22 00:07:46.193 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:07:46 np0005591285 nova_compute[182755]: 2026-01-22 00:07:46.193 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:07:46 np0005591285 nova_compute[182755]: 2026-01-22 00:07:46.591 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:47 np0005591285 nova_compute[182755]: 2026-01-22 00:07:47.052 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:47 np0005591285 nova_compute[182755]: 2026-01-22 00:07:47.052 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:47 np0005591285 nova_compute[182755]: 2026-01-22 00:07:47.052 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:07:47 np0005591285 nova_compute[182755]: 2026-01-22 00:07:47.052 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6167cc82-55cf-479c-a543-101634481524 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:49 np0005591285 nova_compute[182755]: 2026-01-22 00:07:49.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:51 np0005591285 nova_compute[182755]: 2026-01-22 00:07:51.518 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Updating instance_info_cache with network_info: [{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:51 np0005591285 nova_compute[182755]: 2026-01-22 00:07:51.532 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:07:51 np0005591285 nova_compute[182755]: 2026-01-22 00:07:51.532 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:07:51 np0005591285 nova_compute[182755]: 2026-01-22 00:07:51.593 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:51Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:e7:2e 10.100.0.8
Jan 21 19:07:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:51Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:e7:2e 10.100.0.8
Jan 21 19:07:52 np0005591285 podman[227261]: 2026-01-22 00:07:52.174740453 +0000 UTC m=+0.046822851 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.676 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.703 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid 6167cc82-55cf-479c-a543-101634481524 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.703 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.704 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "6167cc82-55cf-479c-a543-101634481524" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.704 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "6167cc82-55cf-479c-a543-101634481524" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.704 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.705 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.742 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "6167cc82-55cf-479c-a543-101634481524" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.744 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:54 np0005591285 nova_compute[182755]: 2026-01-22 00:07:54.819 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:55 np0005591285 nova_compute[182755]: 2026-01-22 00:07:55.125 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:55Z|00376|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:07:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:55Z|00377|binding|INFO|Releasing lport e13e9986-32c8-46d0-b3e3-65edba110bfc from this chassis (sb_readonly=0)
Jan 21 19:07:55 np0005591285 nova_compute[182755]: 2026-01-22 00:07:55.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 podman[227288]: 2026-01-22 00:07:56.183790582 +0000 UTC m=+0.050575571 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:07:56 np0005591285 podman[227287]: 2026-01-22 00:07:56.213719917 +0000 UTC m=+0.082963923 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 19:07:56 np0005591285 podman[227289]: 2026-01-22 00:07:56.214158809 +0000 UTC m=+0.080028274 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.451 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.452 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.452 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.453 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.453 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.466 182759 INFO nova.compute.manager [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Terminating instance#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.476 182759 DEBUG nova.compute.manager [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:07:56 np0005591285 kernel: tapc8f69aa7-69 (unregistering): left promiscuous mode
Jan 21 19:07:56 np0005591285 NetworkManager[55017]: <info>  [1769040476.5006] device (tapc8f69aa7-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:07:56 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:56Z|00378|binding|INFO|Releasing lport c8f69aa7-693e-445d-9997-3c34ee42d0ad from this chassis (sb_readonly=0)
Jan 21 19:07:56 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:56Z|00379|binding|INFO|Setting lport c8f69aa7-693e-445d-9997-3c34ee42d0ad down in Southbound
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.503 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 ovn_controller[94908]: 2026-01-22T00:07:56Z|00380|binding|INFO|Removing iface tapc8f69aa7-69 ovn-installed in OVS
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.507 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.511 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:66:fa 10.100.0.13'], port_security=['fa:16:3e:6a:66:fa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6167cc82-55cf-479c-a543-101634481524', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f4224c0-7028-42a4-a552-421afe2237a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f5e11f5-9bfe-4253-bb8d-4e8170927296, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=c8f69aa7-693e-445d-9997-3c34ee42d0ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.513 104259 INFO neutron.agent.ovn.metadata.agent [-] Port c8f69aa7-693e-445d-9997-3c34ee42d0ad in datapath f10ec1ab-4b98-425d-b81d-b3bec89eb303 unbound from our chassis#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.515 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f10ec1ab-4b98-425d-b81d-b3bec89eb303, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.516 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c58830e5-1f41-4c4a-b035-63dfc21875ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.517 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303 namespace which is not needed anymore#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.523 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000062.scope: Deactivated successfully.
Jan 21 19:07:56 np0005591285 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000062.scope: Consumed 16.122s CPU time.
Jan 21 19:07:56 np0005591285 systemd-machined[154022]: Machine qemu-44-instance-00000062 terminated.
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.594 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [NOTICE]   (226519) : haproxy version is 2.8.14-c23fe91
Jan 21 19:07:56 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [NOTICE]   (226519) : path to executable is /usr/sbin/haproxy
Jan 21 19:07:56 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [WARNING]  (226519) : Exiting Master process...
Jan 21 19:07:56 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [WARNING]  (226519) : Exiting Master process...
Jan 21 19:07:56 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [ALERT]    (226519) : Current worker (226521) exited with code 143 (Terminated)
Jan 21 19:07:56 np0005591285 neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303[226515]: [WARNING]  (226519) : All workers exited. Exiting... (0)
Jan 21 19:07:56 np0005591285 systemd[1]: libpod-270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7.scope: Deactivated successfully.
Jan 21 19:07:56 np0005591285 podman[227377]: 2026-01-22 00:07:56.662504519 +0000 UTC m=+0.048477316 container died 270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 19:07:56 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7-userdata-shm.mount: Deactivated successfully.
Jan 21 19:07:56 np0005591285 systemd[1]: var-lib-containers-storage-overlay-8fcdf618ab44c4445e28131968e548eced512fdaf0da331a39fee4f29efdd80e-merged.mount: Deactivated successfully.
Jan 21 19:07:56 np0005591285 podman[227377]: 2026-01-22 00:07:56.702784502 +0000 UTC m=+0.088757289 container cleanup 270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:07:56 np0005591285 systemd[1]: libpod-conmon-270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7.scope: Deactivated successfully.
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.737 182759 INFO nova.virt.libvirt.driver [-] [instance: 6167cc82-55cf-479c-a543-101634481524] Instance destroyed successfully.#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.737 182759 DEBUG nova.objects.instance [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 6167cc82-55cf-479c-a543-101634481524 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.757 182759 DEBUG nova.virt.libvirt.vif [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:06:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1751465895',display_name='tempest-TestNetworkBasicOps-server-1751465895',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1751465895',id=98,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUQEFaz6xFbfUUIyFDhdsrUS3W/OFy4rX5dg+VsnwsKvybUejlkijhF0sCEbEK0YXF3bzeH12g1xvPUzBL7J4wI/PY6jPsJBtb13ZmCpPxLQe7XZOC3++3YTNNntYsYIQ==',key_name='tempest-TestNetworkBasicOps-194618748',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:06:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-y000o108',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:06:47Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6167cc82-55cf-479c-a543-101634481524,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.757 182759 DEBUG nova.network.os_vif_util [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.758 182759 DEBUG nova.network.os_vif_util [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.758 182759 DEBUG os_vif [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.760 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.760 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8f69aa7-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.762 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.764 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 podman[227412]: 2026-01-22 00:07:56.765090228 +0000 UTC m=+0.041486276 container remove 270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.767 182759 INFO os_vif [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:66:fa,bridge_name='br-int',has_traffic_filtering=True,id=c8f69aa7-693e-445d-9997-3c34ee42d0ad,network=Network(f10ec1ab-4b98-425d-b81d-b3bec89eb303),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f69aa7-69')#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.767 182759 INFO nova.virt.libvirt.driver [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Deleting instance files /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524_del#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.768 182759 INFO nova.virt.libvirt.driver [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Deletion of /var/lib/nova/instances/6167cc82-55cf-479c-a543-101634481524_del complete#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.770 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce424c15-f054-4ba3-94aa-d503be40c5c1]: (4, ('Thu Jan 22 12:07:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303 (270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7)\n270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7\nThu Jan 22 12:07:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303 (270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7)\n270e14f4385c8348b91de78577dcf863ac4ffb67bd9a677ee1570669b1d31dc7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.771 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ff487f9f-46bc-4538-a2d5-412a303616e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.772 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf10ec1ab-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.774 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 kernel: tapf10ec1ab-40: left promiscuous mode
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.786 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.789 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[89a9beee-3e91-4830-9026-2380f5dd3c17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.805 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c1c1ab-d7c7-4f04-bc0d-9030319c59ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.806 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[82d546c9-d290-4019-b85f-c2354e4edc82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.820 182759 DEBUG nova.compute.manager [req-040f2349-1025-4acf-8d87-e4d560e13e45 req-cbc1cbc1-349d-4cb2-bfd0-f08810294558 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-vif-unplugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.820 182759 DEBUG oslo_concurrency.lockutils [req-040f2349-1025-4acf-8d87-e4d560e13e45 req-cbc1cbc1-349d-4cb2-bfd0-f08810294558 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.822 182759 DEBUG oslo_concurrency.lockutils [req-040f2349-1025-4acf-8d87-e4d560e13e45 req-cbc1cbc1-349d-4cb2-bfd0-f08810294558 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.822 182759 DEBUG oslo_concurrency.lockutils [req-040f2349-1025-4acf-8d87-e4d560e13e45 req-cbc1cbc1-349d-4cb2-bfd0-f08810294558 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.823 182759 DEBUG nova.compute.manager [req-040f2349-1025-4acf-8d87-e4d560e13e45 req-cbc1cbc1-349d-4cb2-bfd0-f08810294558 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] No waiting events found dispatching network-vif-unplugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.823 182759 DEBUG nova.compute.manager [req-040f2349-1025-4acf-8d87-e4d560e13e45 req-cbc1cbc1-349d-4cb2-bfd0-f08810294558 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-vif-unplugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.823 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[23eec6ca-305c-4ff3-962a-83cace54a111]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496428, 'reachable_time': 26559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227436, 'error': None, 'target': 'ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 systemd[1]: run-netns-ovnmeta\x2df10ec1ab\x2d4b98\x2d425d\x2db81d\x2db3bec89eb303.mount: Deactivated successfully.
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.825 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f10ec1ab-4b98-425d-b81d-b3bec89eb303 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:07:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:07:56.826 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8df72d-e280-4524-b37d-eb49b253ba75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.855 182759 INFO nova.compute.manager [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.856 182759 DEBUG oslo.service.loopingcall [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.856 182759 DEBUG nova.compute.manager [-] [instance: 6167cc82-55cf-479c-a543-101634481524] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:07:56 np0005591285 nova_compute[182755]: 2026-01-22 00:07:56.857 182759 DEBUG nova.network.neutron [-] [instance: 6167cc82-55cf-479c-a543-101634481524] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:07:57 np0005591285 nova_compute[182755]: 2026-01-22 00:07:57.411 182759 DEBUG nova.compute.manager [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-changed-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:57 np0005591285 nova_compute[182755]: 2026-01-22 00:07:57.412 182759 DEBUG nova.compute.manager [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Refreshing instance network info cache due to event network-changed-c8f69aa7-693e-445d-9997-3c34ee42d0ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:07:57 np0005591285 nova_compute[182755]: 2026-01-22 00:07:57.412 182759 DEBUG oslo_concurrency.lockutils [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:07:57 np0005591285 nova_compute[182755]: 2026-01-22 00:07:57.412 182759 DEBUG oslo_concurrency.lockutils [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:07:57 np0005591285 nova_compute[182755]: 2026-01-22 00:07:57.412 182759 DEBUG nova.network.neutron [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Refreshing network info cache for port c8f69aa7-693e-445d-9997-3c34ee42d0ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.331 182759 DEBUG nova.network.neutron [-] [instance: 6167cc82-55cf-479c-a543-101634481524] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.361 182759 INFO nova.compute.manager [-] [instance: 6167cc82-55cf-479c-a543-101634481524] Took 1.50 seconds to deallocate network for instance.#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.534 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.534 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.747 182759 DEBUG nova.compute.provider_tree [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.782 182759 DEBUG nova.scheduler.client.report [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.825 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:58 np0005591285 nova_compute[182755]: 2026-01-22 00:07:58.966 182759 INFO nova.scheduler.client.report [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 6167cc82-55cf-479c-a543-101634481524#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.063 182759 DEBUG nova.compute.manager [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.064 182759 DEBUG oslo_concurrency.lockutils [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6167cc82-55cf-479c-a543-101634481524-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.064 182759 DEBUG oslo_concurrency.lockutils [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.064 182759 DEBUG oslo_concurrency.lockutils [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6167cc82-55cf-479c-a543-101634481524-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.064 182759 DEBUG nova.compute.manager [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] No waiting events found dispatching network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.065 182759 WARNING nova.compute.manager [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received unexpected event network-vif-plugged-c8f69aa7-693e-445d-9997-3c34ee42d0ad for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.065 182759 DEBUG nova.compute.manager [req-e6636d80-d4a6-418a-98de-09d59efff517 req-d9b5f4bb-81a0-4284-bbc3-e00226c3402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Received event network-vif-deleted-c8f69aa7-693e-445d-9997-3c34ee42d0ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.539 182759 DEBUG oslo_concurrency.lockutils [None req-088cd008-f64c-4a5d-8f98-62599b2274b5 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6167cc82-55cf-479c-a543-101634481524" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.916 182759 DEBUG nova.network.neutron [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updated VIF entry in instance network info cache for port c8f69aa7-693e-445d-9997-3c34ee42d0ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.916 182759 DEBUG nova.network.neutron [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6167cc82-55cf-479c-a543-101634481524] Updating instance_info_cache with network_info: [{"id": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "address": "fa:16:3e:6a:66:fa", "network": {"id": "f10ec1ab-4b98-425d-b81d-b3bec89eb303", "bridge": "br-int", "label": "tempest-network-smoke--2025591889", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f69aa7-69", "ovs_interfaceid": "c8f69aa7-693e-445d-9997-3c34ee42d0ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:07:59 np0005591285 nova_compute[182755]: 2026-01-22 00:07:59.940 182759 DEBUG oslo_concurrency.lockutils [req-9b066f60-380d-4173-bde2-1492d919ae55 req-9ee5e82f-5af5-415c-86b7-f667956c70d5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6167cc82-55cf-479c-a543-101634481524" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:01 np0005591285 nova_compute[182755]: 2026-01-22 00:08:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:01 np0005591285 nova_compute[182755]: 2026-01-22 00:08:01.597 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:01 np0005591285 nova_compute[182755]: 2026-01-22 00:08:01.762 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:02.971 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:02.972 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:02.972 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.448 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.449 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.466 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.585 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.585 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.593 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.594 182759 INFO nova.compute.claims [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Claim successful on node compute-2.ctlplane.example.com
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.737 182759 DEBUG nova.compute.provider_tree [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.753 182759 DEBUG nova.scheduler.client.report [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.772 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.773 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.841 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.842 182759 DEBUG nova.network.neutron [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.868 182759 INFO nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 19:08:04 np0005591285 nova_compute[182755]: 2026-01-22 00:08:04.887 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.041 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.043 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.043 182759 INFO nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Creating image(s)
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.044 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "/var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.044 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.045 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.059 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.083 182759 DEBUG nova.policy [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.137 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.138 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.139 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.158 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.218 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.219 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.253 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.254 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.254 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.311 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.312 182759 DEBUG nova.virt.disk.api [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Checking if we can resize image /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.312 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.367 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.368 182759 DEBUG nova.virt.disk.api [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Cannot resize image /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.368 182759 DEBUG nova.objects.instance [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid a22c5192-6f57-46f1-8073-48ec7852a544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.406 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.406 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Ensure instance console log exists: /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.407 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.407 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:05 np0005591285 nova_compute[182755]: 2026-01-22 00:08:05.407 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.062 182759 DEBUG nova.network.neutron [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Successfully created port: 56184cc4-8230-467f-b464-9f9ebdf257e0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 19:08:06 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:06Z|00381|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.648 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.764 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.968 182759 DEBUG nova.network.neutron [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Successfully updated port: 56184cc4-8230-467f-b464-9f9ebdf257e0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.989 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.989 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:08:06 np0005591285 nova_compute[182755]: 2026-01-22 00:08:06.990 182759 DEBUG nova.network.neutron [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:08:07 np0005591285 nova_compute[182755]: 2026-01-22 00:08:07.120 182759 DEBUG nova.compute.manager [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received event network-changed-56184cc4-8230-467f-b464-9f9ebdf257e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:08:07 np0005591285 nova_compute[182755]: 2026-01-22 00:08:07.121 182759 DEBUG nova.compute.manager [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Refreshing instance network info cache due to event network-changed-56184cc4-8230-467f-b464-9f9ebdf257e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:08:07 np0005591285 nova_compute[182755]: 2026-01-22 00:08:07.121 182759 DEBUG oslo_concurrency.lockutils [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:08:07 np0005591285 nova_compute[182755]: 2026-01-22 00:08:07.289 182759 DEBUG nova.network.neutron [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.462 182759 DEBUG nova.network.neutron [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Updating instance_info_cache with network_info: [{"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.503 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.503 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance network_info: |[{"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.504 182759 DEBUG oslo_concurrency.lockutils [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.504 182759 DEBUG nova.network.neutron [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Refreshing network info cache for port 56184cc4-8230-467f-b464-9f9ebdf257e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.507 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Start _get_guest_xml network_info=[{"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.512 182759 WARNING nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.517 182759 DEBUG nova.virt.libvirt.host [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.517 182759 DEBUG nova.virt.libvirt.host [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.522 182759 DEBUG nova.virt.libvirt.host [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.522 182759 DEBUG nova.virt.libvirt.host [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.523 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.524 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.524 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.524 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.524 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.525 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.525 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.525 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.525 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.525 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.526 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.526 182759 DEBUG nova.virt.hardware [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.530 182759 DEBUG nova.virt.libvirt.vif [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-125029116',display_name='tempest-ServerActionsTestOtherB-server-125029116',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-125029116',id=107,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-hhcmlge4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:04Z,user_data=None,user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a22c5192-6f57-46f1-8073-48ec7852a544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.530 182759 DEBUG nova.network.os_vif_util [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.531 182759 DEBUG nova.network.os_vif_util [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.532 182759 DEBUG nova.objects.instance [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid a22c5192-6f57-46f1-8073-48ec7852a544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.592 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <uuid>a22c5192-6f57-46f1-8073-48ec7852a544</uuid>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <name>instance-0000006b</name>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestOtherB-server-125029116</nova:name>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:08:09</nova:creationTime>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        <nova:port uuid="56184cc4-8230-467f-b464-9f9ebdf257e0">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <entry name="serial">a22c5192-6f57-46f1-8073-48ec7852a544</entry>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <entry name="uuid">a22c5192-6f57-46f1-8073-48ec7852a544</entry>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.config"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:cc:48:66"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <target dev="tap56184cc4-82"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/console.log" append="off"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:08:09 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:08:09 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:08:09 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:08:09 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.593 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Preparing to wait for external event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.594 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.594 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.595 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.596 182759 DEBUG nova.virt.libvirt.vif [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-125029116',display_name='tempest-ServerActionsTestOtherB-server-125029116',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-125029116',id=107,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-hhcmlge4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:04Z,user_data=None,user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a22c5192-6f57-46f1-8073-48ec7852a544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.596 182759 DEBUG nova.network.os_vif_util [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.598 182759 DEBUG nova.network.os_vif_util [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.599 182759 DEBUG os_vif [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.600 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.600 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.601 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.609 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.609 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56184cc4-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.610 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56184cc4-82, col_values=(('external_ids', {'iface-id': '56184cc4-8230-467f-b464-9f9ebdf257e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:48:66', 'vm-uuid': 'a22c5192-6f57-46f1-8073-48ec7852a544'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:09 np0005591285 NetworkManager[55017]: <info>  [1769040489.6138] manager: (tap56184cc4-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.622 182759 INFO os_vif [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82')#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.688 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.689 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.689 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No VIF found with MAC fa:16:3e:cc:48:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:08:09 np0005591285 nova_compute[182755]: 2026-01-22 00:08:09.690 182759 INFO nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Using config drive#033[00m
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.232 182759 INFO nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Creating config drive at /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.config#033[00m
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.243 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqemj5u3d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.375 182759 DEBUG oslo_concurrency.processutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqemj5u3d" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:10 np0005591285 kernel: tap56184cc4-82: entered promiscuous mode
Jan 21 19:08:10 np0005591285 NetworkManager[55017]: <info>  [1769040490.4488] manager: (tap56184cc4-82): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Jan 21 19:08:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:10Z|00382|binding|INFO|Claiming lport 56184cc4-8230-467f-b464-9f9ebdf257e0 for this chassis.
Jan 21 19:08:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:10Z|00383|binding|INFO|56184cc4-8230-467f-b464-9f9ebdf257e0: Claiming fa:16:3e:cc:48:66 10.100.0.10
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.489 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.500 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:48:66 10.100.0.10'], port_security=['fa:16:3e:cc:48:66 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a22c5192-6f57-46f1-8073-48ec7852a544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '51b050e2-1158-4da5-a294-6c6d2a400a60', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=56184cc4-8230-467f-b464-9f9ebdf257e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.501 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 56184cc4-8230-467f-b464-9f9ebdf257e0 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb bound to our chassis#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.503 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb#033[00m
Jan 21 19:08:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:10Z|00384|binding|INFO|Setting lport 56184cc4-8230-467f-b464-9f9ebdf257e0 ovn-installed in OVS
Jan 21 19:08:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:10Z|00385|binding|INFO|Setting lport 56184cc4-8230-467f-b464-9f9ebdf257e0 up in Southbound
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.519 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.523 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fa282995-44be-4244-843d-f7d433aa1960]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:10 np0005591285 systemd-udevd[227473]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:08:10 np0005591285 systemd-machined[154022]: New machine qemu-47-instance-0000006b.
Jan 21 19:08:10 np0005591285 NetworkManager[55017]: <info>  [1769040490.5425] device (tap56184cc4-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:08:10 np0005591285 NetworkManager[55017]: <info>  [1769040490.5442] device (tap56184cc4-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:08:10 np0005591285 systemd[1]: Started Virtual Machine qemu-47-instance-0000006b.
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.569 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[035c28a4-fa36-4e95-89c8-1eb6a52af23a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.574 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[93e6c157-dd01-468c-a39e-fc8a8f8d073f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.619 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4d3e6530-9f9b-4d33-9a38-2d10fb20d1a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.645 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[be9f56ec-3dad-48ba-8030-83071b4ded1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501495, 'reachable_time': 33859, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227485, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.670 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c91a40-f113-42d2-9930-350cc7decd93]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501509, 'tstamp': 501509}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227487, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501513, 'tstamp': 501513}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227487, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.672 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.673 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:10 np0005591285 nova_compute[182755]: 2026-01-22 00:08:10.674 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.674 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.675 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.675 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:10.675 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.099 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040491.0986824, a22c5192-6f57-46f1-8073-48ec7852a544 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.100 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] VM Started (Lifecycle Event)#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.139 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.145 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040491.0990348, a22c5192-6f57-46f1-8073-48ec7852a544 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.145 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.167 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.176 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.216 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.237 182759 DEBUG nova.network.neutron [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Updated VIF entry in instance network info cache for port 56184cc4-8230-467f-b464-9f9ebdf257e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.238 182759 DEBUG nova.network.neutron [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Updating instance_info_cache with network_info: [{"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.258 182759 DEBUG oslo_concurrency.lockutils [req-83c47011-9712-4a3d-a140-eb448e4be157 req-5e2eb799-e156-4e6b-afdf-d75d8ea6a67c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.435 182759 DEBUG nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.436 182759 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.437 182759 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.437 182759 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.437 182759 DEBUG nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Processing event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.438 182759 DEBUG nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.438 182759 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.439 182759 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.439 182759 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.439 182759 DEBUG nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] No waiting events found dispatching network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.440 182759 WARNING nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received unexpected event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.441 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.445 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040491.4453826, a22c5192-6f57-46f1-8073-48ec7852a544 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.446 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.448 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.452 182759 INFO nova.virt.libvirt.driver [-] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance spawned successfully.#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.453 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.472 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.480 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.485 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.485 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.486 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.486 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.487 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.487 182759 DEBUG nova.virt.libvirt.driver [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.519 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.524 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.575 182759 INFO nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Took 6.53 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.576 182759 DEBUG nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.651 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.679 182759 INFO nova.compute.manager [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Took 7.14 seconds to build instance.#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.700 182759 DEBUG oslo_concurrency.lockutils [None req-4b726b44-c025-4b73-b722-d8f06082cbba 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.735 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040476.734625, 6167cc82-55cf-479c-a543-101634481524 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.735 182759 INFO nova.compute.manager [-] [instance: 6167cc82-55cf-479c-a543-101634481524] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.759 182759 DEBUG nova.compute.manager [None req-1a04e38a-3352-48f7-8451-b509e1c44452 - - - - - -] [instance: 6167cc82-55cf-479c-a543-101634481524] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:11.812 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:08:11 np0005591285 nova_compute[182755]: 2026-01-22 00:08:11.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:11.813 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.062 182759 INFO nova.compute.manager [None req-0054b5b9-1d94-4f82-965d-d26db0b28ddb 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Pausing#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.063 182759 DEBUG nova.objects.instance [None req-0054b5b9-1d94-4f82-965d-d26db0b28ddb 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'flavor' on Instance uuid a22c5192-6f57-46f1-8073-48ec7852a544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.107 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040493.1070237, a22c5192-6f57-46f1-8073-48ec7852a544 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.108 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.110 182759 DEBUG nova.compute.manager [None req-0054b5b9-1d94-4f82-965d-d26db0b28ddb 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.135 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.138 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:08:13 np0005591285 nova_compute[182755]: 2026-01-22 00:08:13.176 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 21 19:08:14 np0005591285 nova_compute[182755]: 2026-01-22 00:08:14.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:15 np0005591285 podman[227496]: 2026-01-22 00:08:15.21963235 +0000 UTC m=+0.080778434 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:08:15 np0005591285 podman[227495]: 2026-01-22 00:08:15.220555855 +0000 UTC m=+0.084504334 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 21 19:08:15 np0005591285 nova_compute[182755]: 2026-01-22 00:08:15.833 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:15 np0005591285 nova_compute[182755]: 2026-01-22 00:08:15.834 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:15 np0005591285 nova_compute[182755]: 2026-01-22 00:08:15.834 182759 INFO nova.compute.manager [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Shelving#033[00m
Jan 21 19:08:15 np0005591285 kernel: tap56184cc4-82 (unregistering): left promiscuous mode
Jan 21 19:08:15 np0005591285 NetworkManager[55017]: <info>  [1769040495.9087] device (tap56184cc4-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:08:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:15Z|00386|binding|INFO|Releasing lport 56184cc4-8230-467f-b464-9f9ebdf257e0 from this chassis (sb_readonly=0)
Jan 21 19:08:15 np0005591285 nova_compute[182755]: 2026-01-22 00:08:15.909 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:15Z|00387|binding|INFO|Setting lport 56184cc4-8230-467f-b464-9f9ebdf257e0 down in Southbound
Jan 21 19:08:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:15Z|00388|binding|INFO|Removing iface tap56184cc4-82 ovn-installed in OVS
Jan 21 19:08:15 np0005591285 nova_compute[182755]: 2026-01-22 00:08:15.913 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:15.923 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:48:66 10.100.0.10'], port_security=['fa:16:3e:cc:48:66 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a22c5192-6f57-46f1-8073-48ec7852a544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '51b050e2-1158-4da5-a294-6c6d2a400a60', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=56184cc4-8230-467f-b464-9f9ebdf257e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:08:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:15.924 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 56184cc4-8230-467f-b464-9f9ebdf257e0 in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis#033[00m
Jan 21 19:08:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:15.925 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb#033[00m
Jan 21 19:08:15 np0005591285 nova_compute[182755]: 2026-01-22 00:08:15.946 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:15.955 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[308c2891-2820-43e4-a5d5-32afc73e85e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:15 np0005591285 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Jan 21 19:08:15 np0005591285 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006b.scope: Consumed 2.231s CPU time.
Jan 21 19:08:15 np0005591285 systemd-machined[154022]: Machine qemu-47-instance-0000006b terminated.
Jan 21 19:08:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:15.994 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b764bccc-308c-4e91-a74c-3fb50e8aa986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:15.997 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9463ec-8ad3-4cb3-9352-733d80b396ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.027 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[56510614-bcc0-40eb-99e1-04b719362154]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.032 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.046 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4c19c3-0000-49a3-a706-71c8708b3039]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501495, 'reachable_time': 33859, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227549, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.064 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f939b774-ab37-4723-a58e-c1e587c35878]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501509, 'tstamp': 501509}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227550, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1a4bd631-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501513, 'tstamp': 501513}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227550, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.066 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.071 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.072 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.072 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.072 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.150 182759 INFO nova.virt.libvirt.driver [-] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance destroyed successfully.#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.151 182759 DEBUG nova.objects.instance [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'numa_topology' on Instance uuid a22c5192-6f57-46f1-8073-48ec7852a544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.183 182759 DEBUG nova.compute.manager [req-c5926705-612f-4d26-bcea-b5097901938c req-9c5b0db1-a094-436c-957e-7a2ee6a50f6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received event network-vif-unplugged-56184cc4-8230-467f-b464-9f9ebdf257e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.184 182759 DEBUG oslo_concurrency.lockutils [req-c5926705-612f-4d26-bcea-b5097901938c req-9c5b0db1-a094-436c-957e-7a2ee6a50f6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.184 182759 DEBUG oslo_concurrency.lockutils [req-c5926705-612f-4d26-bcea-b5097901938c req-9c5b0db1-a094-436c-957e-7a2ee6a50f6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.185 182759 DEBUG oslo_concurrency.lockutils [req-c5926705-612f-4d26-bcea-b5097901938c req-9c5b0db1-a094-436c-957e-7a2ee6a50f6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.185 182759 DEBUG nova.compute.manager [req-c5926705-612f-4d26-bcea-b5097901938c req-9c5b0db1-a094-436c-957e-7a2ee6a50f6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] No waiting events found dispatching network-vif-unplugged-56184cc4-8230-467f-b464-9f9ebdf257e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.185 182759 WARNING nova.compute.manager [req-c5926705-612f-4d26-bcea-b5097901938c req-9c5b0db1-a094-436c-957e-7a2ee6a50f6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received unexpected event network-vif-unplugged-56184cc4-8230-467f-b464-9f9ebdf257e0 for instance with vm_state paused and task_state shelving.#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.515 182759 INFO nova.virt.libvirt.driver [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Beginning cold snapshot process#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.652 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.786 182759 DEBUG nova.privsep.utils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.787 182759 DEBUG oslo_concurrency.processutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk /var/lib/nova/instances/snapshots/tmp7riw7sfa/204c7a833c0947c9b73b22f3414e9807 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:16.816 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.981 182759 DEBUG oslo_concurrency.processutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544/disk /var/lib/nova/instances/snapshots/tmp7riw7sfa/204c7a833c0947c9b73b22f3414e9807" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:16 np0005591285 nova_compute[182755]: 2026-01-22 00:08:16.982 182759 INFO nova.virt.libvirt.driver [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Snapshot extracted, beginning image upload#033[00m
Jan 21 19:08:18 np0005591285 nova_compute[182755]: 2026-01-22 00:08:18.338 182759 DEBUG nova.compute.manager [req-cb232b58-0996-4123-bb48-8bd2d2cb0463 req-a0065d8f-bb5c-4c3c-91d7-8f038307796f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:18 np0005591285 nova_compute[182755]: 2026-01-22 00:08:18.339 182759 DEBUG oslo_concurrency.lockutils [req-cb232b58-0996-4123-bb48-8bd2d2cb0463 req-a0065d8f-bb5c-4c3c-91d7-8f038307796f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:18 np0005591285 nova_compute[182755]: 2026-01-22 00:08:18.340 182759 DEBUG oslo_concurrency.lockutils [req-cb232b58-0996-4123-bb48-8bd2d2cb0463 req-a0065d8f-bb5c-4c3c-91d7-8f038307796f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:18 np0005591285 nova_compute[182755]: 2026-01-22 00:08:18.340 182759 DEBUG oslo_concurrency.lockutils [req-cb232b58-0996-4123-bb48-8bd2d2cb0463 req-a0065d8f-bb5c-4c3c-91d7-8f038307796f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:18 np0005591285 nova_compute[182755]: 2026-01-22 00:08:18.341 182759 DEBUG nova.compute.manager [req-cb232b58-0996-4123-bb48-8bd2d2cb0463 req-a0065d8f-bb5c-4c3c-91d7-8f038307796f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] No waiting events found dispatching network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:08:18 np0005591285 nova_compute[182755]: 2026-01-22 00:08:18.341 182759 WARNING nova.compute.manager [req-cb232b58-0996-4123-bb48-8bd2d2cb0463 req-a0065d8f-bb5c-4c3c-91d7-8f038307796f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received unexpected event network-vif-plugged-56184cc4-8230-467f-b464-9f9ebdf257e0 for instance with vm_state paused and task_state shelving_image_uploading.#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.271 182759 INFO nova.virt.libvirt.driver [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Snapshot image upload complete#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.271 182759 DEBUG nova.compute.manager [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.412 182759 INFO nova.compute.manager [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Shelve offloading#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.431 182759 INFO nova.virt.libvirt.driver [-] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance destroyed successfully.#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.432 182759 DEBUG nova.compute.manager [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.434 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.434 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.434 182759 DEBUG nova.network.neutron [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:08:19 np0005591285 nova_compute[182755]: 2026-01-22 00:08:19.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:21 np0005591285 nova_compute[182755]: 2026-01-22 00:08:21.655 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:22 np0005591285 nova_compute[182755]: 2026-01-22 00:08:22.414 182759 DEBUG nova.network.neutron [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Updating instance_info_cache with network_info: [{"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:22 np0005591285 nova_compute[182755]: 2026-01-22 00:08:22.433 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.166 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a22c5192-6f57-46f1-8073-48ec7852a544', 'name': 'tempest-ServerActionsTestOtherB-server-125029116', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006b', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'hostId': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.170 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'name': 'tempest-ServerActionsTestOtherB-server-261934281', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000068', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'hostId': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.172 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.176 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4e87b9c8-cfba-431e-966e-24799ad0ece2 / tap02f1d29d-b6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.177 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b958f0aa-bd2b-4395-9bdc-6d94e21754b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.170814', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '775b5482-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '076e328f0790fda4c936b73372897267b1ceb33c87757e364e334ffbdc76763d'}]}, 'timestamp': '2026-01-22 00:08:23.178371', '_unique_id': '9b8af0313fd04f489634c7835081ff8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.181 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.184 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.199 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.200 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b105291c-2f56-4b5d-8824-0fa881e2b864', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.183283', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '775ec7d4-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.904025238, 'message_signature': '22c3b1412d9373452b9edcefa083adf7a15dd3673829e98b58272e3cdf92a85d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.183283', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '775ed8c8-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.904025238, 'message_signature': '31f9d21d911e7db1459014c023a3b3ea40e874b4984d558f2b65c4df12ee69b6'}]}, 'timestamp': '2026-01-22 00:08:23.201016', '_unique_id': 'faf0d0c07e3945339d69fd542944ddc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.202 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.203 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.204 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 podman[227576]: 2026-01-22 00:08:23.220738013 +0000 UTC m=+0.075684536 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.220 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/cpu volume: 12120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '616e5554-7cfb-4766-81a1-ab875be147db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12120000000, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'timestamp': '2026-01-22T00:08:23.203343', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7761fd00-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.93977437, 'message_signature': '4c3a12348d7b494c480e14ef5f9a386e18b3514dce2f0ef373e34d2278712d87'}]}, 'timestamp': '2026-01-22 00:08:23.221626', '_unique_id': '68d851d5808b4e5f9a7075d5dc1d9147'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.225 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.225 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.incoming.bytes volume: 4273 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8e32d4b-d83b-4ccd-9647-57d802f5b267', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4273, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.224213', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '77629936-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': 'af25bd4def08fd031f7ed4fb60da83c74a5383895f922ddf24043d219b9ae93c'}]}, 'timestamp': '2026-01-22 00:08:23.225539', '_unique_id': '596aee2aa66e440889b2221c569e4a44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.227 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.227 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>]
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.228 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.252 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.write.requests volume: 289 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.252 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58ae9dcb-1fe9-456a-8f44-9d918bea3c54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 289, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.227846', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7766c826-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '0b7cb0ba042dd6330058356a2f76ab9b9ef3d9575b9089c5e74d7cd6893d3e0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.227846', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7766d654-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '25d008dc4508a78a171cfa926b2781158e8acaf4b5cdcedeaf0f72ea14f8493f'}]}, 'timestamp': '2026-01-22 00:08:23.253319', '_unique_id': 'd94fd5ee0dac4def82be1efa8d7d7b9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.256 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.256 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0c318ea-b290-4e87-b1a6-f91e60b70d82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.255473', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '77675afc-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '9ac9669fb91edb88da20b6474312c88781c777f40051afacb17ac00bfc007717'}]}, 'timestamp': '2026-01-22 00:08:23.256710', '_unique_id': '66b15dd52f834cc19466f60f40cc2d82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.258 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.259 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ddf155f-b43f-46c6-a7bf-90184026039d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.258295', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '7767c50a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '2c118c4c8992827228a66ed6005a908eababb0a9f5a3e7e69e1546ce68b41173'}]}, 'timestamp': '2026-01-22 00:08:23.259419', '_unique_id': '6a4474d07d1c4fcb831b610585d44cfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.261 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.261 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.read.bytes volume: 29407744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.262 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '141c69f8-4e09-430e-8d9d-b951bd693ed2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29407744, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.261109', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '776831d4-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '09ade97773110bebb4d588e85620c7fb1a2c5763fc2d291d29efe65892fc6c4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.261109', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77683e22-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '0c79c208c0264060cd6acd9e54ba45c186d55fef86a73817d0d099dfc3ba30c1'}]}, 'timestamp': '2026-01-22 00:08:23.262506', '_unique_id': 'd453adb4a8ce440c9d063ffb5f6b25c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.263 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.264 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.265 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.265 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.265 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa164eb4-6931-469c-aec2-02ed4805cbf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.264150', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7768b348-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '15f904d81f5764f5a461bd712686ec025bf579030e0476be8425aab4c00f5058'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.264150', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7768bf96-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '33c0be60926fecaad534677013da8c3ba01b09c50dd840d7c9876c755555b555'}]}, 'timestamp': '2026-01-22 00:08:23.265820', '_unique_id': '33a1a55339764073872bbaaad433962c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.266 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.268 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.268 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c01620b6-216d-4bad-935f-26e84be234b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.267433', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '77692a08-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '86e7b3463682558b743b19346c1863310605ad6964a4ea5019bcadbd2dd55d91'}]}, 'timestamp': '2026-01-22 00:08:23.268561', '_unique_id': '718223b664f14fe4b658c1f5cc882c5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.270 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.271 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.271 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/memory.usage volume: 42.74609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '064d2725-c66d-4417-b905-e2ade8810c79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.74609375, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'timestamp': '2026-01-22T00:08:23.270585', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '7769a352-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.93977437, 'message_signature': 'b477e6194ecf266e00fd1e1d8e36c98753276bc8558f08c8aeb5ea4837ee3738'}]}, 'timestamp': '2026-01-22 00:08:23.271660', '_unique_id': 'e7e497028feb4bbfb8ece83423d7f34a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.274 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.274 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '561bcf3c-1ba7-4989-bfa0-002a07f0f017', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.273298', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '776a18d2-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '8c1f94499a6b5a13a4d37f02eba1ce25585e4c47d1d902fc3ad13d3ed7713055'}]}, 'timestamp': '2026-01-22 00:08:23.274693', '_unique_id': 'c4e62f669fd147afbadea2ec42662839'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.276 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.276 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd134d0e-9cea-4219-bdb0-06b6ae837544', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.276319', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '776a7cfa-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '6aa8c888785680b749d8739f4a99ea28e40a9984011381ab9cf512992c8bfab3'}]}, 'timestamp': '2026-01-22 00:08:23.277243', '_unique_id': 'a8754327ae5a4556a8b2beeb9289fff8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.279 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.279 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15d8577d-927c-4782-b633-3dfdeb055e4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.278709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '776adb6e-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '5827ee54599c5492fd84ca461dac5e04e361889f510f42057e645de1fc1bfe79'}]}, 'timestamp': '2026-01-22 00:08:23.279658', '_unique_id': '595f8249193143c8a3e88cab1ebfe8ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.281 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.281 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.282 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cabcfccd-ea19-4b64-a32a-83dae9112e4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.281163', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '776b376c-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.904025238, 'message_signature': '07b3c9d8029c84dbffe20006c29283551375150296fb7fdb3d7eb350bf7e5edc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.281163', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '776b56f2-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.904025238, 'message_signature': '18e09874759b04dab12a35dae3dc2a31ebe8f9162cabd8b4bbbef3f003a0f2d7'}]}, 'timestamp': '2026-01-22 00:08:23.282812', '_unique_id': '7a49e59978c741f2a2df40ba5270c55c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.284 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.285 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4cae461-c2d5-48af-89d6-ee22a03271a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.284363', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '776bbcaa-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': '80c89890c642db1a9df6cd1c3a93e85febe9c1311f04d0ad9fa52337a440df8b'}]}, 'timestamp': '2026-01-22 00:08:23.285423', '_unique_id': '433aee44a7164d28a89377020a6d08e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.286 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.287 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>]
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.287 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.287 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>]
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.288 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.288 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2698401e-9fa2-4f35-9945-ad48476ef675', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': 'instance-00000068-4e87b9c8-cfba-431e-966e-24799ad0ece2-tap02f1d29d-b6', 'timestamp': '2026-01-22T00:08:23.287620', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'tap02f1d29d-b6', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:e7:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap02f1d29d-b6'}, 'message_id': '776c38a6-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.891712347, 'message_signature': 'abeaee04801c7d3f401e394c19154a468172e944a2e805edfcf9474e5c07051a'}]}, 'timestamp': '2026-01-22 00:08:23.288552', '_unique_id': '0f1269acb6514492a943be0cf2a14503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.289 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.290 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.290 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.read.latency volume: 203964241 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.290 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.read.latency volume: 24650124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9b465f-e130-44ab-a578-a9b3621c6018', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 203964241, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.289631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '776c8022-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '53c20cc0b2a5a519a7537b94c7917a9085cc755c5c0cf281fc5f133059ae8667'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24650124, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.289631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '776c88ce-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '0fd4cfc97418df61c8391796acc84a6bb7b9d2eca626741be3d91e75c0e5f8c4'}]}, 'timestamp': '2026-01-22 00:08:23.290599', '_unique_id': '2f82e6a61d1c4a22af2559dabf264358'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.291 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.292 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.292 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.292 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4edf95b0-53ba-479b-a883-64edfc05a9fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.291707', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '776cd306-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.904025238, 'message_signature': 'eb42c9702e15efbe6b31e3f4a277ef35905baeae44c77f39f61594690545d05f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.291707', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '776cdbb2-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.904025238, 'message_signature': '46e11822aae0dcc57145a25138df840379a10a473b475531aaf1d70387ad6170'}]}, 'timestamp': '2026-01-22 00:08:23.292716', '_unique_id': '357eca2e0c274a4e9252a7a65f5f1071'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.293 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.294 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.294 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.read.requests volume: 1065 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.294 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a3c5fef-4f47-48d4-b35e-c51e58354fb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1065, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.293815', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '776d2388-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '0574bc75aa68bc0e4e7bdb1b292f77dcac0842c9c65fbf9db4b715a1089ee084'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.293815', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '776d2c0c-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '8dde1e5557d7147dd7e82e821dcbda2e1a40633b51196fc4200723dbd9180e79'}]}, 'timestamp': '2026-01-22 00:08:23.294772', '_unique_id': '03c21e491db8467f8cb018b740b28c48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 ERROR oslo_messaging.notify.messaging 
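The traceback above ends in kombu's `_reraise_as_library_errors`, which converts the socket-level `ConnectionRefusedError` into `kombu.exceptions.OperationalError` while chaining the original exception via `raise ... from exc` (hence the "The above exception was the direct cause of the following exception" divider). A minimal sketch of that wrapping pattern — the class and helper here are simplified stand-ins, not kombu's actual implementation:

```python
from contextlib import contextmanager


class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""


@contextmanager
def reraise_as_library_errors(*errors):
    # Simplified version of the _reraise_as_library_errors frame in the
    # traceback: low-level OS/socket errors escaping the block are
    # re-raised as the library's own exception type, with the original
    # kept as __cause__ so the full socket traceback survives.
    try:
        yield
    except errors as exc:
        raise OperationalError(str(exc)) from exc
```

Calling code then sees `OperationalError: [Errno 111] Connection refused`, exactly as logged, while the underlying `ConnectionRefusedError` remains available as the chained cause.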
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.296 12 DEBUG ceilometer.compute.pollsters [-] Instance a22c5192-6f57-46f1-8073-48ec7852a544 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000006b, id=a22c5192-6f57-46f1-8073-48ec7852a544>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.296 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.write.latency volume: 1825917779 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.296 12 DEBUG ceilometer.compute.pollsters [-] 4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd877a532-131d-4ad6-868c-de75cfdeadb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1825917779, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-vda', 'timestamp': '2026-01-22T00:08:23.295888', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '776d73a6-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '16978118115acdebfb9be35bf5584eb08c70195b881774d80ea06f7759cc8ac6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '365f219cd09c471fa6275faa2fe5e2a1', 'user_name': None, 'project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'project_name': None, 'resource_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2-sda', 'timestamp': '2026-01-22T00:08:23.295888', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-261934281', 'name': 'instance-00000068', 'instance_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'instance_type': 'm1.nano', 'host': '1ae8d5dbf6e768f46f4454b06fe927f747a2ace1503561a97d1508fb', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '776d7c34-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5061.947949, 'message_signature': '3df6b8081c5f206ea7d82f40bb1d36ffb1290c69f0ce9c9ad189127b4726a8a6'}]}, 'timestamp': '2026-01-22 00:08:23.296828', '_unique_id': '054650bb24b149859f7de6e76b5a4b18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 ERROR oslo_messaging.notify.messaging 
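Both tracebacks pass through `retry_over_time` in `kombu/utils/functional.py` before the connection attempt is abandoned. A reduced sketch of that retry loop (the real function takes additional arguments such as timeouts and `interval_max`; this is illustrative only):

```python
import time


def retry_over_time(fun, catch, max_retries=2,
                    interval_start=0.0, interval_step=0.0, errback=None):
    # Reduced sketch of kombu's retry_over_time: keep calling fun(),
    # sleeping a growing interval between attempts.  Once max_retries
    # is exhausted the last exception propagates to the caller, which
    # is how the ConnectionRefusedError above reaches the notifier.
    interval = interval_start
    for retries in range(max_retries + 1):
        try:
            return fun()
        except catch as exc:
            if retries >= max_retries:
                raise
            if errback is not None:
                errback(exc, interval)
            time.sleep(interval)
            interval += interval_step
```

With a reachable broker the factory eventually succeeds and the loop returns the connection; with the broker down (as here) every attempt fails and the final error escapes.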
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.297 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:08:23.298 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherB-server-125029116>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-261934281>]
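`PollsterPermanentError` is ceilometer's signal that a pollster can never succeed for the listed resources (here `LibvirtInspector` provides no per-device IOPS data), so the polling manager excludes them from future cycles on this source. A minimal sketch of that blacklisting behaviour — illustrative only, not ceilometer's actual manager code:

```python
class PollsterPermanentError(Exception):
    """Raised by a pollster that can never work for these resources."""

    def __init__(self, resources):
        super().__init__(resources)
        self.fail_res_list = resources


def run_poll(pollster, resources, blacklist):
    # Skip resources this pollster has already permanently failed on;
    # on a new permanent failure, record them so later polling cycles
    # never retry ("Prevent pollster ... from polling ... anymore!").
    todo = [r for r in resources if (pollster.name, r) not in blacklist]
    try:
        return pollster.get_samples(todo)
    except PollsterPermanentError as err:
        for res in err.fail_res_list:
            blacklist.add((pollster.name, res))
        return []
```

This explains why the `disk.device.iops` error appears once and then goes quiet, while the transient AMQP errors above repeat on every polling interval.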
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.191 182759 INFO nova.virt.libvirt.driver [-] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Instance destroyed successfully.#033[00m
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.192 182759 DEBUG nova.objects.instance [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid a22c5192-6f57-46f1-8073-48ec7852a544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.206 182759 DEBUG nova.virt.libvirt.vif [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-125029116',display_name='tempest-ServerActionsTestOtherB-server-125029116',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-125029116',id=107,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-hhcmlge4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member',shelved_at='2026-01-22T00:08:19.271680',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='b5b2c786-8dcc-47bf-91b9-e8f0523a23d4'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:17Z,user_data=None,user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=a22c5192-6f57-46f1-8073-48ec7852a544,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.206 182759 DEBUG nova.network.os_vif_util [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56184cc4-82", "ovs_interfaceid": "56184cc4-8230-467f-b464-9f9ebdf257e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.207 182759 DEBUG nova.network.os_vif_util [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.208 182759 DEBUG os_vif [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.209 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.210 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56184cc4-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.260 182759 INFO os_vif [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:48:66,bridge_name='br-int',has_traffic_filtering=True,id=56184cc4-8230-467f-b464-9f9ebdf257e0,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56184cc4-82')
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.260 182759 INFO nova.virt.libvirt.driver [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Deleting instance files /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544_del
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.261 182759 INFO nova.virt.libvirt.driver [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Deletion of /var/lib/nova/instances/a22c5192-6f57-46f1-8073-48ec7852a544_del complete
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.286 182759 DEBUG nova.compute.manager [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Received event network-changed-56184cc4-8230-467f-b464-9f9ebdf257e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.286 182759 DEBUG nova.compute.manager [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Refreshing instance network info cache due to event network-changed-56184cc4-8230-467f-b464-9f9ebdf257e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.287 182759 DEBUG oslo_concurrency.lockutils [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.287 182759 DEBUG oslo_concurrency.lockutils [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.287 182759 DEBUG nova.network.neutron [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Refreshing network info cache for port 56184cc4-8230-467f-b464-9f9ebdf257e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.423 182759 INFO nova.scheduler.client.report [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Deleted allocations for instance a22c5192-6f57-46f1-8073-48ec7852a544
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.503 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.504 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.640 182759 DEBUG nova.compute.provider_tree [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.664 182759 DEBUG nova.scheduler.client.report [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.693 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:08:24 np0005591285 nova_compute[182755]: 2026-01-22 00:08:24.786 182759 DEBUG oslo_concurrency.lockutils [None req-464ab5f8-60e8-453e-a55a-ebbc4be5425f 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "a22c5192-6f57-46f1-8073-48ec7852a544" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 8.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:08:26 np0005591285 nova_compute[182755]: 2026-01-22 00:08:26.681 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:27 np0005591285 podman[227600]: 2026-01-22 00:08:27.203794704 +0000 UTC m=+0.062975565 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 19:08:27 np0005591285 podman[227601]: 2026-01-22 00:08:27.207839763 +0000 UTC m=+0.068277487 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:08:27 np0005591285 podman[227602]: 2026-01-22 00:08:27.233535955 +0000 UTC m=+0.090005623 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 21 19:08:27 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:27Z|00389|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:08:27 np0005591285 nova_compute[182755]: 2026-01-22 00:08:27.901 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:28 np0005591285 nova_compute[182755]: 2026-01-22 00:08:28.052 182759 DEBUG nova.network.neutron [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Updated VIF entry in instance network info cache for port 56184cc4-8230-467f-b464-9f9ebdf257e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 19:08:28 np0005591285 nova_compute[182755]: 2026-01-22 00:08:28.053 182759 DEBUG nova.network.neutron [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Updating instance_info_cache with network_info: [{"id": "56184cc4-8230-467f-b464-9f9ebdf257e0", "address": "fa:16:3e:cc:48:66", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap56184cc4-82", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:08:28 np0005591285 nova_compute[182755]: 2026-01-22 00:08:28.222 182759 DEBUG oslo_concurrency.lockutils [req-baf070e9-dfab-4a4f-a486-477f3c975760 req-2c87c44e-b020-48f3-aaf4-fcbed5e4f5a7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a22c5192-6f57-46f1-8073-48ec7852a544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:08:29 np0005591285 nova_compute[182755]: 2026-01-22 00:08:29.254 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:30 np0005591285 nova_compute[182755]: 2026-01-22 00:08:30.166 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:08:30 np0005591285 nova_compute[182755]: 2026-01-22 00:08:30.167 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:08:30 np0005591285 nova_compute[182755]: 2026-01-22 00:08:30.167 182759 INFO nova.compute.manager [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Shelving
Jan 21 19:08:30 np0005591285 nova_compute[182755]: 2026-01-22 00:08:30.241 182759 DEBUG nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 19:08:31 np0005591285 nova_compute[182755]: 2026-01-22 00:08:31.149 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040496.148383, a22c5192-6f57-46f1-8073-48ec7852a544 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:08:31 np0005591285 nova_compute[182755]: 2026-01-22 00:08:31.150 182759 INFO nova.compute.manager [-] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] VM Stopped (Lifecycle Event)
Jan 21 19:08:31 np0005591285 nova_compute[182755]: 2026-01-22 00:08:31.171 182759 DEBUG nova.compute.manager [None req-766e09d2-6aae-4ee7-87ab-50348ea27221 - - - - - -] [instance: a22c5192-6f57-46f1-8073-48ec7852a544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:08:31 np0005591285 nova_compute[182755]: 2026-01-22 00:08:31.684 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 kernel: tap02f1d29d-b6 (unregistering): left promiscuous mode
Jan 21 19:08:32 np0005591285 NetworkManager[55017]: <info>  [1769040512.4368] device (tap02f1d29d-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:08:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:32Z|00390|binding|INFO|Releasing lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af from this chassis (sb_readonly=0)
Jan 21 19:08:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:32Z|00391|binding|INFO|Setting lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af down in Southbound
Jan 21 19:08:32 np0005591285 nova_compute[182755]: 2026-01-22 00:08:32.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:32Z|00392|binding|INFO|Removing iface tap02f1d29d-b6 ovn-installed in OVS
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.453 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:e7:2e 10.100.0.8'], port_security=['fa:16:3e:7c:e7:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=02f1d29d-b6df-46d8-8387-cfa84ffb24af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.454 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 02f1d29d-b6df-46d8-8387-cfa84ffb24af in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.455 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.456 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[438ed83e-65b3-40a8-99f0-7a463f06e272]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.457 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace which is not needed anymore
Jan 21 19:08:32 np0005591285 nova_compute[182755]: 2026-01-22 00:08:32.457 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 21 19:08:32 np0005591285 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000068.scope: Consumed 15.669s CPU time.
Jan 21 19:08:32 np0005591285 systemd-machined[154022]: Machine qemu-46-instance-00000068 terminated.
Jan 21 19:08:32 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [NOTICE]   (227181) : haproxy version is 2.8.14-c23fe91
Jan 21 19:08:32 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [NOTICE]   (227181) : path to executable is /usr/sbin/haproxy
Jan 21 19:08:32 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [WARNING]  (227181) : Exiting Master process...
Jan 21 19:08:32 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [ALERT]    (227181) : Current worker (227183) exited with code 143 (Terminated)
Jan 21 19:08:32 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[227177]: [WARNING]  (227181) : All workers exited. Exiting... (0)
Jan 21 19:08:32 np0005591285 systemd[1]: libpod-b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf.scope: Deactivated successfully.
Jan 21 19:08:32 np0005591285 podman[227697]: 2026-01-22 00:08:32.650305691 +0000 UTC m=+0.066568302 container died b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:08:32 np0005591285 nova_compute[182755]: 2026-01-22 00:08:32.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf-userdata-shm.mount: Deactivated successfully.
Jan 21 19:08:32 np0005591285 systemd[1]: var-lib-containers-storage-overlay-8f38838ec0e67ba5930ac143552c01becc60215c2657e40e88a8d82cb4196e40-merged.mount: Deactivated successfully.
Jan 21 19:08:32 np0005591285 nova_compute[182755]: 2026-01-22 00:08:32.693 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 podman[227697]: 2026-01-22 00:08:32.701351953 +0000 UTC m=+0.117614524 container cleanup b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 19:08:32 np0005591285 systemd[1]: libpod-conmon-b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf.scope: Deactivated successfully.
Jan 21 19:08:32 np0005591285 podman[227739]: 2026-01-22 00:08:32.760305589 +0000 UTC m=+0.038291330 container remove b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.765 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f66f80b-e792-47fb-8e5d-c8785e550345]: (4, ('Thu Jan 22 12:08:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf)\nb11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf\nThu Jan 22 12:08:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (b11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf)\nb11e7f79b0557916ed8a3ce4c7962ab8bb5dfe741fdf43b30b8ceb05607bdfcf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.767 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f7efe5c0-a67e-43b3-84ef-fe2f0f17c7bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.768 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:08:32 np0005591285 nova_compute[182755]: 2026-01-22 00:08:32.769 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 kernel: tap1a4bd631-60: left promiscuous mode
Jan 21 19:08:32 np0005591285 nova_compute[182755]: 2026-01-22 00:08:32.784 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.787 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec16fae-adc1-4f22-864e-64020ca72938]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.805 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[64905ae6-ae15-4feb-bf4f-8a2db59fc034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.806 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9eaaa515-a01b-4e44-834d-8a6f77751a8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.819 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ac7af8dc-03f7-40e4-9a26-51d48fc7a4df]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501488, 'reachable_time': 16651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227761, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.822 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:08:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:32.822 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[ab74ec88-e92d-41a1-9cd5-65b8cfd8c0e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:32 np0005591285 systemd[1]: run-netns-ovnmeta\x2d1a4bd631\x2d64c5\x2d4e00\x2d9341\x2d0e44fd0833fb.mount: Deactivated successfully.
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.250 182759 DEBUG nova.compute.manager [req-5614da13-4da5-4833-bd82-bf8b906edb86 req-2aabd932-24f4-4582-86a4-99d2d3203771 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-unplugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.251 182759 DEBUG oslo_concurrency.lockutils [req-5614da13-4da5-4833-bd82-bf8b906edb86 req-2aabd932-24f4-4582-86a4-99d2d3203771 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.251 182759 DEBUG oslo_concurrency.lockutils [req-5614da13-4da5-4833-bd82-bf8b906edb86 req-2aabd932-24f4-4582-86a4-99d2d3203771 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.252 182759 DEBUG oslo_concurrency.lockutils [req-5614da13-4da5-4833-bd82-bf8b906edb86 req-2aabd932-24f4-4582-86a4-99d2d3203771 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.252 182759 DEBUG nova.compute.manager [req-5614da13-4da5-4833-bd82-bf8b906edb86 req-2aabd932-24f4-4582-86a4-99d2d3203771 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] No waiting events found dispatching network-vif-unplugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.252 182759 WARNING nova.compute.manager [req-5614da13-4da5-4833-bd82-bf8b906edb86 req-2aabd932-24f4-4582-86a4-99d2d3203771 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received unexpected event network-vif-unplugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af for instance with vm_state active and task_state shelving.#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.259 182759 INFO nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance shutdown successfully after 3 seconds.#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.264 182759 INFO nova.virt.libvirt.driver [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance destroyed successfully.#033[00m
Jan 21 19:08:33 np0005591285 nova_compute[182755]: 2026-01-22 00:08:33.264 182759 DEBUG nova.objects.instance [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'numa_topology' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.045 182759 INFO nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Beginning cold snapshot process#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.227 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.256 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.382 182759 DEBUG nova.privsep.utils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.383 182759 DEBUG oslo_concurrency.processutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk /var/lib/nova/instances/snapshots/tmpyqyduxor/35a9afc10d2b4acfafe277aa0203a11f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.807 182759 DEBUG oslo_concurrency.processutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk /var/lib/nova/instances/snapshots/tmpyqyduxor/35a9afc10d2b4acfafe277aa0203a11f" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:34 np0005591285 nova_compute[182755]: 2026-01-22 00:08:34.808 182759 INFO nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Snapshot extracted, beginning image upload#033[00m
Jan 21 19:08:35 np0005591285 nova_compute[182755]: 2026-01-22 00:08:35.622 182759 DEBUG nova.compute.manager [req-55ca4a31-1b2f-48a1-8eee-d0812b942c98 req-5d4f779d-79bd-47f7-9338-2721a5358555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:35 np0005591285 nova_compute[182755]: 2026-01-22 00:08:35.622 182759 DEBUG oslo_concurrency.lockutils [req-55ca4a31-1b2f-48a1-8eee-d0812b942c98 req-5d4f779d-79bd-47f7-9338-2721a5358555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:35 np0005591285 nova_compute[182755]: 2026-01-22 00:08:35.622 182759 DEBUG oslo_concurrency.lockutils [req-55ca4a31-1b2f-48a1-8eee-d0812b942c98 req-5d4f779d-79bd-47f7-9338-2721a5358555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:35 np0005591285 nova_compute[182755]: 2026-01-22 00:08:35.623 182759 DEBUG oslo_concurrency.lockutils [req-55ca4a31-1b2f-48a1-8eee-d0812b942c98 req-5d4f779d-79bd-47f7-9338-2721a5358555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:35 np0005591285 nova_compute[182755]: 2026-01-22 00:08:35.623 182759 DEBUG nova.compute.manager [req-55ca4a31-1b2f-48a1-8eee-d0812b942c98 req-5d4f779d-79bd-47f7-9338-2721a5358555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] No waiting events found dispatching network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:08:35 np0005591285 nova_compute[182755]: 2026-01-22 00:08:35.623 182759 WARNING nova.compute.manager [req-55ca4a31-1b2f-48a1-8eee-d0812b942c98 req-5d4f779d-79bd-47f7-9338-2721a5358555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received unexpected event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 21 19:08:36 np0005591285 nova_compute[182755]: 2026-01-22 00:08:36.685 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.844 182759 INFO nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Snapshot image upload complete#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.845 182759 DEBUG nova.compute.manager [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.956 182759 INFO nova.compute.manager [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Shelve offloading#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.980 182759 INFO nova.virt.libvirt.driver [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance destroyed successfully.#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.981 182759 DEBUG nova.compute.manager [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.985 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.985 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:37 np0005591285 nova_compute[182755]: 2026-01-22 00:08:37.986 182759 DEBUG nova.network.neutron [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:08:39 np0005591285 nova_compute[182755]: 2026-01-22 00:08:39.260 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:40 np0005591285 nova_compute[182755]: 2026-01-22 00:08:40.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:41 np0005591285 nova_compute[182755]: 2026-01-22 00:08:41.086 182759 DEBUG nova.network.neutron [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:41 np0005591285 nova_compute[182755]: 2026-01-22 00:08:41.117 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:41 np0005591285 nova_compute[182755]: 2026-01-22 00:08:41.226 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:41 np0005591285 nova_compute[182755]: 2026-01-22 00:08:41.725 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.612 182759 INFO nova.virt.libvirt.driver [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance destroyed successfully.#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.612 182759 DEBUG nova.objects.instance [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.630 182759 DEBUG nova.virt.libvirt.vif [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:07:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-261934281',display_name='tempest-ServerActionsTestOtherB-server-261934281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-261934281',id=104,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-bqv3jv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member',shelved_at='2026-01-22T00:08:37.845451',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='570dc310-bae9-497b-a8ff-1f0fd81ca729'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=4e87b9c8-cfba-431e-966e-24799ad0ece2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": 
"1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.630 182759 DEBUG nova.network.os_vif_util [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.631 182759 DEBUG nova.network.os_vif_util [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.632 182759 DEBUG os_vif [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.635 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02f1d29d-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.637 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.643 182759 INFO os_vif [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6')#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.644 182759 INFO nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Deleting instance files /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2_del#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.655 182759 INFO nova.virt.libvirt.driver [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Deletion of /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2_del complete#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.701 182759 DEBUG nova.compute.manager [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.701 182759 DEBUG nova.compute.manager [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing instance network info cache due to event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.702 182759 DEBUG oslo_concurrency.lockutils [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.702 182759 DEBUG oslo_concurrency.lockutils [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.702 182759 DEBUG nova.network.neutron [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.838 182759 INFO nova.scheduler.client.report [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Deleted allocations for instance 4e87b9c8-cfba-431e-966e-24799ad0ece2#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.937 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.938 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:42 np0005591285 nova_compute[182755]: 2026-01-22 00:08:42.978 182759 DEBUG nova.compute.provider_tree [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:08:43 np0005591285 nova_compute[182755]: 2026-01-22 00:08:43.001 182759 DEBUG nova.scheduler.client.report [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:08:43 np0005591285 nova_compute[182755]: 2026-01-22 00:08:43.025 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:43 np0005591285 nova_compute[182755]: 2026-01-22 00:08:43.129 182759 DEBUG oslo_concurrency.lockutils [None req-de1e0738-8854-463f-8397-67597a09b4ac 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:44 np0005591285 nova_compute[182755]: 2026-01-22 00:08:44.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:44 np0005591285 nova_compute[182755]: 2026-01-22 00:08:44.391 182759 DEBUG nova.network.neutron [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updated VIF entry in instance network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:08:44 np0005591285 nova_compute[182755]: 2026-01-22 00:08:44.393 182759 DEBUG nova.network.neutron [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:44 np0005591285 nova_compute[182755]: 2026-01-22 00:08:44.422 182759 DEBUG oslo_concurrency.lockutils [req-696050d9-7ba9-4926-8c3a-0633d4f346e4 req-118a62f3-5073-4b8e-87bc-c8947e72771e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.286 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.287 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.288 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.288 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.325 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.325 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.326 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.326 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:08:45 np0005591285 podman[227772]: 2026-01-22 00:08:45.458002225 +0000 UTC m=+0.094857453 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 19:08:45 np0005591285 podman[227774]: 2026-01-22 00:08:45.457961394 +0000 UTC m=+0.078659437 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.515 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.516 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5697MB free_disk=73.26092529296875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.516 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.516 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.657 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.657 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.687 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.711 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.768 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:08:45 np0005591285 nova_compute[182755]: 2026-01-22 00:08:45.769 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:46 np0005591285 nova_compute[182755]: 2026-01-22 00:08:46.579 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:46 np0005591285 nova_compute[182755]: 2026-01-22 00:08:46.727 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:47 np0005591285 nova_compute[182755]: 2026-01-22 00:08:47.638 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:47 np0005591285 nova_compute[182755]: 2026-01-22 00:08:47.724 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040512.7239585, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:47 np0005591285 nova_compute[182755]: 2026-01-22 00:08:47.725 182759 INFO nova.compute.manager [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:08:47 np0005591285 nova_compute[182755]: 2026-01-22 00:08:47.751 182759 DEBUG nova.compute.manager [None req-78dffc6b-2d1e-4ba0-a099-30407261e2c9 - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.584 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.584 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.585 182759 INFO nova.compute.manager [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Unshelving#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.701 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.702 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.705 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_requests' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.727 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'numa_topology' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.762 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.763 182759 INFO nova.compute.claims [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.945 182759 DEBUG nova.compute.provider_tree [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.977 182759 DEBUG nova.scheduler.client.report [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:08:49 np0005591285 nova_compute[182755]: 2026-01-22 00:08:49.998 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:50 np0005591285 nova_compute[182755]: 2026-01-22 00:08:50.195 182759 INFO nova.network.neutron [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating port 02f1d29d-b6df-46d8-8387-cfa84ffb24af with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.085 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.085 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.116 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.224 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.224 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.231 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.231 182759 INFO nova.compute.claims [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.445 182759 DEBUG nova.compute.provider_tree [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.478 182759 DEBUG nova.scheduler.client.report [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.505 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.506 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.566 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.566 182759 DEBUG nova.network.neutron [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.590 182759 INFO nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.605 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.701 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.702 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.702 182759 DEBUG nova.network.neutron [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.742 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.744 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.745 182759 INFO nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Creating image(s)#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.745 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.746 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.747 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.766 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.769 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.845 182759 DEBUG nova.compute.manager [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.845 182759 DEBUG nova.compute.manager [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing instance network info cache due to event network-changed-02f1d29d-b6df-46d8-8387-cfa84ffb24af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.846 182759 DEBUG oslo_concurrency.lockutils [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.860 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.861 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.862 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.872 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.894 182759 DEBUG nova.policy [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8324d8ba232c476e925d31b7d5645a7a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b9315c6168049d79f20d630e51ffff3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.938 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.939 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.975 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.976 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:51 np0005591285 nova_compute[182755]: 2026-01-22 00:08:51.976 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.032 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.033 182759 DEBUG nova.virt.disk.api [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Checking if we can resize image /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.034 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.100 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.101 182759 DEBUG nova.virt.disk.api [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Cannot resize image /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.101 182759 DEBUG nova.objects.instance [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'migration_context' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.116 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.117 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Ensure instance console log exists: /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.118 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.118 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.118 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:52 np0005591285 nova_compute[182755]: 2026-01-22 00:08:52.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.716 182759 DEBUG nova.network.neutron [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Successfully created port: ebf5a837-6957-4227-9b3d-1ae66eb381bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.807 182759 DEBUG nova.network.neutron [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.828 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.830 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.830 182759 INFO nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Creating image(s)#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.831 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.831 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.832 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.832 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.835 182759 DEBUG oslo_concurrency.lockutils [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.835 182759 DEBUG nova.network.neutron [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Refreshing network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.849 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "b33c9920bf26c6dae549fa60eaf22a65772f20df" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:53 np0005591285 nova_compute[182755]: 2026-01-22 00:08:53.850 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "b33c9920bf26c6dae549fa60eaf22a65772f20df" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:54 np0005591285 podman[227830]: 2026-01-22 00:08:54.211779854 +0000 UTC m=+0.071056302 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.385 182759 DEBUG nova.network.neutron [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Successfully updated port: ebf5a837-6957-4227-9b3d-1ae66eb381bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.405 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.405 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquired lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.405 182759 DEBUG nova.network.neutron [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.493 182759 DEBUG nova.compute.manager [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-changed-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.494 182759 DEBUG nova.compute.manager [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Refreshing instance network info cache due to event network-changed-ebf5a837-6957-4227-9b3d-1ae66eb381bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.494 182759 DEBUG oslo_concurrency.lockutils [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.645 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.678 182759 DEBUG nova.network.neutron [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.738 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.part --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.739 182759 DEBUG nova.virt.images [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] 570dc310-bae9-497b-a8ff-1f0fd81ca729 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.741 182759 DEBUG nova.privsep.utils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 19:08:55 np0005591285 nova_compute[182755]: 2026-01-22 00:08:55.742 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.part /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.036 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.part /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.converted" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.054 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.114 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df.converted --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.116 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "b33c9920bf26c6dae549fa60eaf22a65772f20df" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.143 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.230 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.232 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "b33c9920bf26c6dae549fa60eaf22a65772f20df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.234 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "b33c9920bf26c6dae549fa60eaf22a65772f20df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.261 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.358 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.360 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df,backing_fmt=raw /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.413 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df,backing_fmt=raw /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.414 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "b33c9920bf26c6dae549fa60eaf22a65772f20df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.415 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.493 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.495 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'migration_context' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.629 182759 INFO nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Rebasing disk image.#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.630 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.689 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.690 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:56 np0005591285 nova_compute[182755]: 2026-01-22 00:08:56.769 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:57 np0005591285 nova_compute[182755]: 2026-01-22 00:08:57.147 182759 DEBUG nova.network.neutron [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updated VIF entry in instance network info cache for port 02f1d29d-b6df-46d8-8387-cfa84ffb24af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:08:57 np0005591285 nova_compute[182755]: 2026-01-22 00:08:57.148 182759 DEBUG nova.network.neutron [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:57 np0005591285 nova_compute[182755]: 2026-01-22 00:08:57.187 182759 DEBUG oslo_concurrency.lockutils [req-33005401-5f04-4099-96c7-97dce7af0236 req-860c7de4-a62e-489f-990c-9371f32977e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-4e87b9c8-cfba-431e-966e-24799ad0ece2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:57 np0005591285 nova_compute[182755]: 2026-01-22 00:08:57.643 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.096 182759 DEBUG nova.network.neutron [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updating instance_info_cache with network_info: [{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.123 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Releasing lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.123 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance network_info: |[{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.125 182759 DEBUG oslo_concurrency.lockutils [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.125 182759 DEBUG nova.network.neutron [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Refreshing network info cache for port ebf5a837-6957-4227-9b3d-1ae66eb381bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.130 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Start _get_guest_xml network_info=[{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.154 182759 WARNING nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.165 182759 DEBUG nova.virt.libvirt.host [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.165 182759 DEBUG nova.virt.libvirt.host [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.172 182759 DEBUG nova.virt.libvirt.host [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.173 182759 DEBUG nova.virt.libvirt.host [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.174 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.174 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.175 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.175 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.175 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.175 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.175 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.175 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.176 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.176 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.176 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.176 182759 DEBUG nova.virt.hardware [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.180 182759 DEBUG nova.virt.libvirt.vif [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-927653202',display_name='tempest-ServerRescueTestJSON-server-927653202',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-927653202',id=110,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-8xr05bq6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-40178
7473-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:51Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=cacae884-d2ca-4741-952f-59ffbb641328,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.180 182759 DEBUG nova.network.os_vif_util [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.181 182759 DEBUG nova.network.os_vif_util [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.182 182759 DEBUG nova.objects.instance [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'pci_devices' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:58 np0005591285 podman[227887]: 2026-01-22 00:08:58.236010132 +0000 UTC m=+0.070761593 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.237 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk" returned: 0 in 1.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.237 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.238 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Ensure instance console log exists: /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.238 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.238 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.239 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:58 np0005591285 podman[227886]: 2026-01-22 00:08:58.240628237 +0000 UTC m=+0.077368552 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.241 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Start _get_guest_xml network_info=[{"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='3a48f94f9543c71d7fdd546b417942a8',container_format='bare',created_at=2026-01-22T00:08:30Z,direct_url=<?>,disk_format='qcow2',id=570dc310-bae9-497b-a8ff-1f0fd81ca729,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-261934281-shelved',owner='b26cf6f4abd54e30aac169a3cbca648c',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T00:08:37Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.244 182759 WARNING nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.247 182759 DEBUG nova.virt.libvirt.host [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.248 182759 DEBUG nova.virt.libvirt.host [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.250 182759 DEBUG nova.virt.libvirt.host [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.250 182759 DEBUG nova.virt.libvirt.host [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.251 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.251 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='3a48f94f9543c71d7fdd546b417942a8',container_format='bare',created_at=2026-01-22T00:08:30Z,direct_url=<?>,disk_format='qcow2',id=570dc310-bae9-497b-a8ff-1f0fd81ca729,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-261934281-shelved',owner='b26cf6f4abd54e30aac169a3cbca648c',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T00:08:37Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.251 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.252 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.253 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.253 182759 DEBUG nova.virt.hardware [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.253 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.256 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <uuid>cacae884-d2ca-4741-952f-59ffbb641328</uuid>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <name>instance-0000006e</name>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerRescueTestJSON-server-927653202</nova:name>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:08:58</nova:creationTime>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:user uuid="8324d8ba232c476e925d31b7d5645a7a">tempest-ServerRescueTestJSON-401787473-project-member</nova:user>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:project uuid="3b9315c6168049d79f20d630e51ffff3">tempest-ServerRescueTestJSON-401787473</nova:project>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:port uuid="ebf5a837-6957-4227-9b3d-1ae66eb381bd">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="serial">cacae884-d2ca-4741-952f-59ffbb641328</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="uuid">cacae884-d2ca-4741-952f-59ffbb641328</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:52:ad:ee"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <target dev="tapebf5a837-69"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/console.log" append="off"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:08:58 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:08:58 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.256 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Preparing to wait for external event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.256 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.256 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.257 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.257 182759 DEBUG nova.virt.libvirt.vif [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-927653202',display_name='tempest-ServerRescueTestJSON-server-927653202',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-927653202',id=110,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-8xr05bq6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTest
JSON-401787473-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:51Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=cacae884-d2ca-4741-952f-59ffbb641328,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.257 182759 DEBUG nova.network.os_vif_util [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.258 182759 DEBUG nova.network.os_vif_util [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.258 182759 DEBUG os_vif [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.258 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.259 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.259 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.262 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.262 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebf5a837-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.262 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebf5a837-69, col_values=(('external_ids', {'iface-id': 'ebf5a837-6957-4227-9b3d-1ae66eb381bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:ad:ee', 'vm-uuid': 'cacae884-d2ca-4741-952f-59ffbb641328'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.263 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 NetworkManager[55017]: <info>  [1769040538.2642] manager: (tapebf5a837-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.265 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.270 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.271 182759 INFO os_vif [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69')#033[00m
Jan 21 19:08:58 np0005591285 podman[227888]: 2026-01-22 00:08:58.297833166 +0000 UTC m=+0.139947395 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.354 182759 DEBUG nova.virt.libvirt.vif [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:07:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-261934281',display_name='tempest-ServerActionsTestOtherB-server-261934281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-261934281',id=104,image_ref='570dc310-bae9-497b-a8ff-1f0fd81ca729',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-bqv3jv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',i
mage_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member',shelved_at='2026-01-22T00:08:37.845451',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='570dc310-bae9-497b-a8ff-1f0fd81ca729'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=4e87b9c8-cfba-431e-966e-24799ad0ece2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.354 182759 DEBUG nova.network.os_vif_util [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.355 182759 DEBUG nova.network.os_vif_util [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.357 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.386 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <uuid>4e87b9c8-cfba-431e-966e-24799ad0ece2</uuid>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <name>instance-00000068</name>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerActionsTestOtherB-server-261934281</nova:name>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:08:58</nova:creationTime>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:user uuid="365f219cd09c471fa6275faa2fe5e2a1">tempest-ServerActionsTestOtherB-1685479237-project-member</nova:user>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:project uuid="b26cf6f4abd54e30aac169a3cbca648c">tempest-ServerActionsTestOtherB-1685479237</nova:project>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="570dc310-bae9-497b-a8ff-1f0fd81ca729"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        <nova:port uuid="02f1d29d-b6df-46d8-8387-cfa84ffb24af">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="serial">4e87b9c8-cfba-431e-966e-24799ad0ece2</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="uuid">4e87b9c8-cfba-431e-966e-24799ad0ece2</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:7c:e7:2e"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <target dev="tap02f1d29d-b6"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/console.log" append="off"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:08:58 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:08:58 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:08:58 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:08:58 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.387 182759 DEBUG nova.compute.manager [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Preparing to wait for external event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.387 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.387 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.388 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.388 182759 DEBUG nova.virt.libvirt.vif [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:07:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-261934281',display_name='tempest-ServerActionsTestOtherB-server-261934281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-261934281',id=104,image_ref='570dc310-bae9-497b-a8ff-1f0fd81ca729',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:37Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-bqv3jv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model=
'virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member',shelved_at='2026-01-22T00:08:37.845451',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='570dc310-bae9-497b-a8ff-1f0fd81ca729'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=4e87b9c8-cfba-431e-966e-24799ad0ece2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.389 182759 DEBUG nova.network.os_vif_util [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.389 182759 DEBUG nova.network.os_vif_util [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.390 182759 DEBUG os_vif [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.391 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.391 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.391 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.394 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.395 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02f1d29d-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.395 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02f1d29d-b6, col_values=(('external_ids', {'iface-id': '02f1d29d-b6df-46d8-8387-cfa84ffb24af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:e7:2e', 'vm-uuid': '4e87b9c8-cfba-431e-966e-24799ad0ece2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:58 np0005591285 NetworkManager[55017]: <info>  [1769040538.4071] manager: (tap02f1d29d-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.407 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.411 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.412 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.412 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No VIF found with MAC fa:16:3e:52:ad:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.413 182759 INFO nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Using config drive#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.417 182759 INFO os_vif [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6')#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.518 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.518 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.518 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] No VIF found with MAC fa:16:3e:7c:e7:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.519 182759 INFO nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Using config drive#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.691 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.757 182759 DEBUG nova.objects.instance [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'keypairs' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.911 182759 INFO nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Creating config drive at /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config#033[00m
Jan 21 19:08:58 np0005591285 nova_compute[182755]: 2026-01-22 00:08:58.917 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg5cvdv53 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.049 182759 DEBUG oslo_concurrency.processutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg5cvdv53" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:59 np0005591285 kernel: tapebf5a837-69: entered promiscuous mode
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.1100] manager: (tapebf5a837-69): new Tun device (/org/freedesktop/NetworkManager/Devices/194)
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00393|binding|INFO|Claiming lport ebf5a837-6957-4227-9b3d-1ae66eb381bd for this chassis.
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00394|binding|INFO|ebf5a837-6957-4227-9b3d-1ae66eb381bd: Claiming fa:16:3e:52:ad:ee 10.100.0.11
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.115 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.121 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:ad:ee 10.100.0.11'], port_security=['fa:16:3e:52:ad:ee 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cacae884-d2ca-4741-952f-59ffbb641328', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ebf5a837-6957-4227-9b3d-1ae66eb381bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.122 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ebf5a837-6957-4227-9b3d-1ae66eb381bd in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 bound to our chassis#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.123 104259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.124 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae4c59f-305d-416f-8e5e-3f38783f48a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.138 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00395|binding|INFO|Setting lport ebf5a837-6957-4227-9b3d-1ae66eb381bd ovn-installed in OVS
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00396|binding|INFO|Setting lport ebf5a837-6957-4227-9b3d-1ae66eb381bd up in Southbound
Jan 21 19:08:59 np0005591285 systemd-udevd[227969]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.141 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.147 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.1524] device (tapebf5a837-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.1534] device (tapebf5a837-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:08:59 np0005591285 systemd-machined[154022]: New machine qemu-48-instance-0000006e.
Jan 21 19:08:59 np0005591285 systemd[1]: Started Virtual Machine qemu-48-instance-0000006e.
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.372 182759 INFO nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Creating config drive at /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.380 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjp36gw0l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.453 182759 DEBUG nova.compute.manager [req-3768e0a8-1290-416d-bb1a-d6f35a03c547 req-06053179-d53c-4d0c-80d8-cb001b197a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.454 182759 DEBUG oslo_concurrency.lockutils [req-3768e0a8-1290-416d-bb1a-d6f35a03c547 req-06053179-d53c-4d0c-80d8-cb001b197a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.454 182759 DEBUG oslo_concurrency.lockutils [req-3768e0a8-1290-416d-bb1a-d6f35a03c547 req-06053179-d53c-4d0c-80d8-cb001b197a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.454 182759 DEBUG oslo_concurrency.lockutils [req-3768e0a8-1290-416d-bb1a-d6f35a03c547 req-06053179-d53c-4d0c-80d8-cb001b197a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.455 182759 DEBUG nova.compute.manager [req-3768e0a8-1290-416d-bb1a-d6f35a03c547 req-06053179-d53c-4d0c-80d8-cb001b197a19 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Processing event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.505 182759 DEBUG oslo_concurrency.processutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjp36gw0l" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:08:59 np0005591285 kernel: tap02f1d29d-b6: entered promiscuous mode
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.5706] manager: (tap02f1d29d-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00397|binding|INFO|Claiming lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af for this chassis.
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00398|binding|INFO|02f1d29d-b6df-46d8-8387-cfa84ffb24af: Claiming fa:16:3e:7c:e7:2e 10.100.0.8
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.5832] device (tap02f1d29d-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00399|binding|INFO|Setting lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af ovn-installed in OVS
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00400|binding|INFO|Setting lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af up in Southbound
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.5840] device (tap02f1d29d-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.585 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.587 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:e7:2e 10.100.0.8'], port_security=['fa:16:3e:7c:e7:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=02f1d29d-b6df-46d8-8387-cfa84ffb24af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.589 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 02f1d29d-b6df-46d8-8387-cfa84ffb24af in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb bound to our chassis#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.593 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.604 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[42afeaba-b2c1-4cc4-be51-758e5f70df2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.605 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1a4bd631-61 in ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.607 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1a4bd631-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.607 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0de2a430-e7dd-4649-bebe-0962d6005ca0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.608 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a6d1246c-ae13-4fd9-87dd-e6d53ce1bb94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 systemd-machined[154022]: New machine qemu-49-instance-00000068.
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.619 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[393f3a76-92d4-4111-b5b9-5e7152a3c049]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 systemd[1]: Started Virtual Machine qemu-49-instance-00000068.
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.637 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7eda80d6-eab1-4832-ab0a-e91aa7d1ed71]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.651 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040539.6514854, cacae884-d2ca-4741-952f-59ffbb641328 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.652 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] VM Started (Lifecycle Event)#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.654 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.659 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.663 182759 INFO nova.virt.libvirt.driver [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance spawned successfully.#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.663 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.664 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[375e0f05-3860-4d64-bce6-dc4d455e7227]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.6703] manager: (tap1a4bd631-60): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.670 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dedb9d0c-0288-450f-81b8-5a78925733b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.685 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.708 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.712 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.712 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.712 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.713 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.713 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.713 182759 DEBUG nova.virt.libvirt.driver [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.713 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d15f0643-76cb-441c-904b-9f1731f110f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.717 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[56f0f6af-2af1-4dba-8eca-3f931a1f343c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.7383] device (tap1a4bd631-60): carrier: link connected
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.742 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.743 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040539.6539204, cacae884-d2ca-4741-952f-59ffbb641328 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.743 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.744 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[7ade8a1a-ac30-42f4-a2cd-32792b5657ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.764 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc7bf33-f208-406d-a926-777af63b9ab3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509839, 'reachable_time': 44929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228041, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.773 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.777 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040539.657754, cacae884-d2ca-4741-952f-59ffbb641328 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.777 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.784 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[080d7a62-6020-4465-a050-522ed7e285f0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:7833'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509839, 'tstamp': 509839}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228042, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.809 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a4ffee-34df-4e8d-822e-6f29ce7c84be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1a4bd631-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:78:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509839, 'reachable_time': 44929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228043, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.810 182759 INFO nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Took 8.07 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.810 182759 DEBUG nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.817 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.828 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.847 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc74c3b-68bf-451b-bef7-a593618529ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.874 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.923 182759 INFO nova.compute.manager [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Took 8.73 seconds to build instance.#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.932 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa2e3a2-b82b-4e89-b89f-14fb30a1f27b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.933 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.933 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.934 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a4bd631-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.935 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 NetworkManager[55017]: <info>  [1769040539.9362] manager: (tap1a4bd631-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 21 19:08:59 np0005591285 kernel: tap1a4bd631-60: entered promiscuous mode
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.938 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.939 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1a4bd631-60, col_values=(('external_ids', {'iface-id': 'c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.940 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 ovn_controller[94908]: 2026-01-22T00:08:59Z|00401|binding|INFO|Releasing lport c2dbe75a-81e7-4c52-bada-9acaf8fbaf5c from this chassis (sb_readonly=0)
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.943 182759 DEBUG oslo_concurrency.lockutils [None req-16135a30-0415-48fc-85cb-ae2a817f04b8 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.964 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.965 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[874bd5a4-6c73-4691-8751-0ac56cbea744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.966 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/1a4bd631-64c5-4e00-9341-0e44fd0833fb.pid.haproxy
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 1a4bd631-64c5-4e00-9341-0e44fd0833fb
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:08:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:08:59.968 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'env', 'PROCESS_TAG=haproxy-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1a4bd631-64c5-4e00-9341-0e44fd0833fb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.982 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040539.9823682, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:08:59 np0005591285 nova_compute[182755]: 2026-01-22 00:08:59.983 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Started (Lifecycle Event)#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.006 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.010 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040539.984504, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.010 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.131 182759 DEBUG nova.network.neutron [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updated VIF entry in instance network info cache for port ebf5a837-6957-4227-9b3d-1ae66eb381bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.132 182759 DEBUG nova.network.neutron [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updating instance_info_cache with network_info: [{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.135 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.138 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:09:00 np0005591285 podman[228082]: 2026-01-22 00:09:00.422451347 +0000 UTC m=+0.056774638 container create deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:09:00 np0005591285 systemd[1]: Started libpod-conmon-deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950.scope.
Jan 21 19:09:00 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:09:00 np0005591285 podman[228082]: 2026-01-22 00:09:00.396178711 +0000 UTC m=+0.030502022 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:09:00 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01bb4b39c1271c60be420e9b84c447749dafc2a7da4fe4fd78e13e91297f2f65/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:09:00 np0005591285 podman[228082]: 2026-01-22 00:09:00.50769672 +0000 UTC m=+0.142020041 container init deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:09:00 np0005591285 podman[228082]: 2026-01-22 00:09:00.512821338 +0000 UTC m=+0.147144629 container start deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.530 182759 DEBUG nova.compute.manager [req-4c6e334d-0e8c-40e5-b6a7-31479c426fdd req-735977ad-9429-4e0a-a1ae-1556af6a6a8e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.530 182759 DEBUG oslo_concurrency.lockutils [req-4c6e334d-0e8c-40e5-b6a7-31479c426fdd req-735977ad-9429-4e0a-a1ae-1556af6a6a8e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.531 182759 DEBUG oslo_concurrency.lockutils [req-4c6e334d-0e8c-40e5-b6a7-31479c426fdd req-735977ad-9429-4e0a-a1ae-1556af6a6a8e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.531 182759 DEBUG oslo_concurrency.lockutils [req-4c6e334d-0e8c-40e5-b6a7-31479c426fdd req-735977ad-9429-4e0a-a1ae-1556af6a6a8e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.531 182759 DEBUG nova.compute.manager [req-4c6e334d-0e8c-40e5-b6a7-31479c426fdd req-735977ad-9429-4e0a-a1ae-1556af6a6a8e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Processing event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.532 182759 DEBUG nova.compute.manager [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.536 182759 DEBUG nova.virt.libvirt.driver [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.543 182759 INFO nova.virt.libvirt.driver [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance spawned successfully.
Jan 21 19:09:00 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [NOTICE]   (228101) : New worker (228103) forked
Jan 21 19:09:00 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [NOTICE]   (228101) : Loading success.
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.550 182759 DEBUG oslo_concurrency.lockutils [req-584d2787-f89e-44db-8bcc-145ab190725a req-0d5af684-e6db-4b22-afc0-b7657c58808a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.552 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.552 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040540.5348058, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.552 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Resumed (Lifecycle Event)
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.705 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.711 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:09:00 np0005591285 nova_compute[182755]: 2026-01-22 00:09:00.947 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.767 182759 DEBUG nova.compute.manager [req-53cddb98-7430-46a3-a62c-342097cec439 req-d87e5203-7c66-4998-8600-6128b9874a9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.768 182759 DEBUG oslo_concurrency.lockutils [req-53cddb98-7430-46a3-a62c-342097cec439 req-d87e5203-7c66-4998-8600-6128b9874a9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.769 182759 DEBUG oslo_concurrency.lockutils [req-53cddb98-7430-46a3-a62c-342097cec439 req-d87e5203-7c66-4998-8600-6128b9874a9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.769 182759 DEBUG oslo_concurrency.lockutils [req-53cddb98-7430-46a3-a62c-342097cec439 req-d87e5203-7c66-4998-8600-6128b9874a9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.770 182759 DEBUG nova.compute.manager [req-53cddb98-7430-46a3-a62c-342097cec439 req-d87e5203-7c66-4998-8600-6128b9874a9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.770 182759 WARNING nova.compute.manager [req-53cddb98-7430-46a3-a62c-342097cec439 req-d87e5203-7c66-4998-8600-6128b9874a9a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received unexpected event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with vm_state active and task_state None.
Jan 21 19:09:01 np0005591285 nova_compute[182755]: 2026-01-22 00:09:01.806 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.215 182759 DEBUG nova.compute.manager [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.590 182759 INFO nova.compute.manager [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Rescuing
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.591 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.592 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquired lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.592 182759 DEBUG nova.network.neutron [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.603 182759 DEBUG oslo_concurrency.lockutils [None req-1620f237-30e3-4c25-9f9f-e88b9800a1b5 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 13.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.768 182759 DEBUG nova.compute.manager [req-b3fd88b0-fe36-4199-84b3-c0acf5025142 req-2ecaacf8-0350-4d8f-99ba-805bb87d372d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.768 182759 DEBUG oslo_concurrency.lockutils [req-b3fd88b0-fe36-4199-84b3-c0acf5025142 req-2ecaacf8-0350-4d8f-99ba-805bb87d372d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.769 182759 DEBUG oslo_concurrency.lockutils [req-b3fd88b0-fe36-4199-84b3-c0acf5025142 req-2ecaacf8-0350-4d8f-99ba-805bb87d372d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.769 182759 DEBUG oslo_concurrency.lockutils [req-b3fd88b0-fe36-4199-84b3-c0acf5025142 req-2ecaacf8-0350-4d8f-99ba-805bb87d372d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.769 182759 DEBUG nova.compute.manager [req-b3fd88b0-fe36-4199-84b3-c0acf5025142 req-2ecaacf8-0350-4d8f-99ba-805bb87d372d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] No waiting events found dispatching network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:09:02 np0005591285 nova_compute[182755]: 2026-01-22 00:09:02.769 182759 WARNING nova.compute.manager [req-b3fd88b0-fe36-4199-84b3-c0acf5025142 req-2ecaacf8-0350-4d8f-99ba-805bb87d372d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received unexpected event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af for instance with vm_state active and task_state None.
Jan 21 19:09:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:02.972 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:09:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:02.973 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:09:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:02.974 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:09:03 np0005591285 nova_compute[182755]: 2026-01-22 00:09:03.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:05 np0005591285 nova_compute[182755]: 2026-01-22 00:09:05.863 182759 DEBUG nova.network.neutron [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updating instance_info_cache with network_info: [{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:09:06 np0005591285 nova_compute[182755]: 2026-01-22 00:09:06.490 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Releasing lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:09:06 np0005591285 nova_compute[182755]: 2026-01-22 00:09:06.844 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:07 np0005591285 nova_compute[182755]: 2026-01-22 00:09:07.375 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 19:09:08 np0005591285 nova_compute[182755]: 2026-01-22 00:09:08.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.387 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.387 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.387 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.388 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.388 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.407 182759 INFO nova.compute.manager [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Terminating instance
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.429 182759 DEBUG nova.compute.manager [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 19:09:10 np0005591285 kernel: tap02f1d29d-b6 (unregistering): left promiscuous mode
Jan 21 19:09:10 np0005591285 NetworkManager[55017]: <info>  [1769040550.4685] device (tap02f1d29d-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:09:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:10Z|00402|binding|INFO|Releasing lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af from this chassis (sb_readonly=0)
Jan 21 19:09:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:10Z|00403|binding|INFO|Setting lport 02f1d29d-b6df-46d8-8387-cfa84ffb24af down in Southbound
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.499 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:10Z|00404|binding|INFO|Removing iface tap02f1d29d-b6 ovn-installed in OVS
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.503 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.514 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:e7:2e 10.100.0.8'], port_security=['fa:16:3e:7c:e7:2e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '4e87b9c8-cfba-431e-966e-24799ad0ece2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b26cf6f4abd54e30aac169a3cbca648c', 'neutron:revision_number': '9', 'neutron:security_group_ids': '80fb8d02-77b3-43f5-9cd3-4114236093b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d46c6b58-b03f-4ac4-a6dd-9f507a40241a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=02f1d29d-b6df-46d8-8387-cfa84ffb24af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.515 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 02f1d29d-b6df-46d8-8387-cfa84ffb24af in datapath 1a4bd631-64c5-4e00-9341-0e44fd0833fb unbound from our chassis
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.517 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a4bd631-64c5-4e00-9341-0e44fd0833fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.519 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[87f924ff-fbed-4ac0-998e-4b9dd9d7503c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.520 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb namespace which is not needed anymore
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.531 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:09:10 np0005591285 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000068.scope: Deactivated successfully.
Jan 21 19:09:10 np0005591285 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000068.scope: Consumed 10.859s CPU time.
Jan 21 19:09:10 np0005591285 systemd-machined[154022]: Machine qemu-49-instance-00000068 terminated.
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.685 182759 INFO nova.virt.libvirt.driver [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Instance destroyed successfully.
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.686 182759 DEBUG nova.objects.instance [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lazy-loading 'resources' on Instance uuid 4e87b9c8-cfba-431e-966e-24799ad0ece2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:09:10 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [NOTICE]   (228101) : haproxy version is 2.8.14-c23fe91
Jan 21 19:09:10 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [NOTICE]   (228101) : path to executable is /usr/sbin/haproxy
Jan 21 19:09:10 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [WARNING]  (228101) : Exiting Master process...
Jan 21 19:09:10 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [ALERT]    (228101) : Current worker (228103) exited with code 143 (Terminated)
Jan 21 19:09:10 np0005591285 neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb[228097]: [WARNING]  (228101) : All workers exited. Exiting... (0)
Jan 21 19:09:10 np0005591285 systemd[1]: libpod-deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950.scope: Deactivated successfully.
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.700 182759 DEBUG nova.virt.libvirt.vif [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:07:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-261934281',display_name='tempest-ServerActionsTestOtherB-server-261934281',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-261934281',id=104,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOWyAqsdytk3W3HzFQzJP3BXJvSwE75PC1SitNdFnRhcK3nyEFtPzs/DJKbijcwArzRvqYzid7Fty+N11Hyd1TaRzX9I0f6oLPrGjMRpZbi4YRQ8Uh8k7+UR1VtydcvTDA==',key_name='tempest-keypair-1040205771',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b26cf6f4abd54e30aac169a3cbca648c',ramdisk_id='',reservation_id='r-bqv3jv0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1685479237',owner_user_name='tempest-ServerActionsTestOtherB-1685479237-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:09:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='365f219cd09c471fa6275faa2fe5e2a1',uuid=4e87b9c8-cfba-431e-966e-24799ad0ece2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.701 182759 DEBUG nova.network.os_vif_util [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converting VIF {"id": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "address": "fa:16:3e:7c:e7:2e", "network": {"id": "1a4bd631-64c5-4e00-9341-0e44fd0833fb", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1779791452-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b26cf6f4abd54e30aac169a3cbca648c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02f1d29d-b6", "ovs_interfaceid": "02f1d29d-b6df-46d8-8387-cfa84ffb24af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:09:10 np0005591285 podman[228137]: 2026-01-22 00:09:10.720400292 +0000 UTC m=+0.071253528 container died deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.720 182759 DEBUG nova.network.os_vif_util [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.722 182759 DEBUG os_vif [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.725 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02f1d29d-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.731 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.733 182759 INFO os_vif [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:e7:2e,bridge_name='br-int',has_traffic_filtering=True,id=02f1d29d-b6df-46d8-8387-cfa84ffb24af,network=Network(1a4bd631-64c5-4e00-9341-0e44fd0833fb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02f1d29d-b6')#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.734 182759 INFO nova.virt.libvirt.driver [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Deleting instance files /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2_del#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.739 182759 INFO nova.virt.libvirt.driver [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Deletion of /var/lib/nova/instances/4e87b9c8-cfba-431e-966e-24799ad0ece2_del complete#033[00m
Jan 21 19:09:10 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950-userdata-shm.mount: Deactivated successfully.
Jan 21 19:09:10 np0005591285 systemd[1]: var-lib-containers-storage-overlay-01bb4b39c1271c60be420e9b84c447749dafc2a7da4fe4fd78e13e91297f2f65-merged.mount: Deactivated successfully.
Jan 21 19:09:10 np0005591285 podman[228137]: 2026-01-22 00:09:10.759559756 +0000 UTC m=+0.110412972 container cleanup deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:09:10 np0005591285 systemd[1]: libpod-conmon-deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950.scope: Deactivated successfully.
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.822 182759 INFO nova.compute.manager [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.823 182759 DEBUG oslo.service.loopingcall [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.823 182759 DEBUG nova.compute.manager [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.823 182759 DEBUG nova.network.neutron [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:09:10 np0005591285 podman[228183]: 2026-01-22 00:09:10.840634386 +0000 UTC m=+0.057314103 container remove deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.846 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ef4f75-4d92-4fd0-b585-24e6000b87bc]: (4, ('Thu Jan 22 12:09:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950)\ndeb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950\nThu Jan 22 12:09:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb (deb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950)\ndeb6f724b07c02d613586488e57ffd2c0d77cf6911e2df4276c807d1de654950\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.848 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0896fc92-d1cc-4ae9-95fe-c33dcc685ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.849 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a4bd631-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:09:10 np0005591285 kernel: tap1a4bd631-60: left promiscuous mode
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.851 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.853 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:10 np0005591285 nova_compute[182755]: 2026-01-22 00:09:10.866 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.867 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[009659ea-0e66-462d-ad83-626a758b677e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.879 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f37ec042-e347-497f-940f-fd72f3782017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.882 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[67404c29-b892-4950-bd8b-6e10fc7eba91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.907 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fb36c0-17a2-41bd-9631-d59fd162cfa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509831, 'reachable_time': 16397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228198, 'error': None, 'target': 'ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.910 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1a4bd631-64c5-4e00-9341-0e44fd0833fb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:09:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:10.911 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[22395e5d-0802-4425-a7cf-d142b71b2890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:10 np0005591285 systemd[1]: run-netns-ovnmeta\x2d1a4bd631\x2d64c5\x2d4e00\x2d9341\x2d0e44fd0833fb.mount: Deactivated successfully.
Jan 21 19:09:11 np0005591285 nova_compute[182755]: 2026-01-22 00:09:11.845 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:12 np0005591285 nova_compute[182755]: 2026-01-22 00:09:12.872 182759 DEBUG nova.network.neutron [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:09:12 np0005591285 nova_compute[182755]: 2026-01-22 00:09:12.898 182759 INFO nova.compute.manager [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Took 2.07 seconds to deallocate network for instance.#033[00m
Jan 21 19:09:12 np0005591285 nova_compute[182755]: 2026-01-22 00:09:12.995 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:12 np0005591285 nova_compute[182755]: 2026-01-22 00:09:12.996 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.101 182759 DEBUG nova.compute.provider_tree [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.120 182759 DEBUG nova.scheduler.client.report [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.157 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.212 182759 INFO nova.scheduler.client.report [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Deleted allocations for instance 4e87b9c8-cfba-431e-966e-24799ad0ece2#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.305 182759 DEBUG oslo_concurrency.lockutils [None req-c1183517-2ac1-42aa-afbc-e93c0db8fa48 365f219cd09c471fa6275faa2fe5e2a1 b26cf6f4abd54e30aac169a3cbca648c - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:13.500 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:13.502 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.771 182759 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-unplugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.771 182759 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.771 182759 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.772 182759 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.772 182759 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] No waiting events found dispatching network-vif-unplugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.772 182759 WARNING nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received unexpected event network-vif-unplugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.773 182759 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.773 182759 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.773 182759 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.773 182759 DEBUG oslo_concurrency.lockutils [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4e87b9c8-cfba-431e-966e-24799ad0ece2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.774 182759 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] No waiting events found dispatching network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.774 182759 WARNING nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received unexpected event network-vif-plugged-02f1d29d-b6df-46d8-8387-cfa84ffb24af for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:09:13 np0005591285 nova_compute[182755]: 2026-01-22 00:09:13.774 182759 DEBUG nova.compute.manager [req-0566c5c8-a2d3-467e-ade5-bb1dda16d42f req-5e581df1-a662-4951-b949-71f63efdb112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Received event network-vif-deleted-02f1d29d-b6df-46d8-8387-cfa84ffb24af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:15.504 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:09:15 np0005591285 nova_compute[182755]: 2026-01-22 00:09:15.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:16 np0005591285 podman[228215]: 2026-01-22 00:09:16.242989246 +0000 UTC m=+0.103506016 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:09:16 np0005591285 podman[228214]: 2026-01-22 00:09:16.266809157 +0000 UTC m=+0.128095527 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 19:09:16 np0005591285 nova_compute[182755]: 2026-01-22 00:09:16.847 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:17 np0005591285 nova_compute[182755]: 2026-01-22 00:09:17.428 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 21 19:09:19 np0005591285 kernel: tapebf5a837-69 (unregistering): left promiscuous mode
Jan 21 19:09:19 np0005591285 NetworkManager[55017]: <info>  [1769040559.6811] device (tapebf5a837-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:09:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:19Z|00405|binding|INFO|Releasing lport ebf5a837-6957-4227-9b3d-1ae66eb381bd from this chassis (sb_readonly=0)
Jan 21 19:09:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:19Z|00406|binding|INFO|Setting lport ebf5a837-6957-4227-9b3d-1ae66eb381bd down in Southbound
Jan 21 19:09:19 np0005591285 nova_compute[182755]: 2026-01-22 00:09:19.688 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:19Z|00407|binding|INFO|Removing iface tapebf5a837-69 ovn-installed in OVS
Jan 21 19:09:19 np0005591285 nova_compute[182755]: 2026-01-22 00:09:19.689 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:19.695 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:ad:ee 10.100.0.11'], port_security=['fa:16:3e:52:ad:ee 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cacae884-d2ca-4741-952f-59ffbb641328', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ebf5a837-6957-4227-9b3d-1ae66eb381bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:09:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:19.696 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ebf5a837-6957-4227-9b3d-1ae66eb381bd in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 unbound from our chassis#033[00m
Jan 21 19:09:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:19.696 104259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 21 19:09:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:19.697 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac7f8dc-bc74-403d-99c6-185daf5bd0f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:19 np0005591285 nova_compute[182755]: 2026-01-22 00:09:19.700 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:19 np0005591285 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 21 19:09:19 np0005591285 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006e.scope: Consumed 14.088s CPU time.
Jan 21 19:09:19 np0005591285 systemd-machined[154022]: Machine qemu-48-instance-0000006e terminated.
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.193 182759 DEBUG nova.compute.manager [req-008fb41e-0543-4b1f-8283-affaf814c572 req-dd453c67-63bc-4852-8a5d-8ff8bb56ccd1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-unplugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.194 182759 DEBUG oslo_concurrency.lockutils [req-008fb41e-0543-4b1f-8283-affaf814c572 req-dd453c67-63bc-4852-8a5d-8ff8bb56ccd1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.194 182759 DEBUG oslo_concurrency.lockutils [req-008fb41e-0543-4b1f-8283-affaf814c572 req-dd453c67-63bc-4852-8a5d-8ff8bb56ccd1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.195 182759 DEBUG oslo_concurrency.lockutils [req-008fb41e-0543-4b1f-8283-affaf814c572 req-dd453c67-63bc-4852-8a5d-8ff8bb56ccd1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.195 182759 DEBUG nova.compute.manager [req-008fb41e-0543-4b1f-8283-affaf814c572 req-dd453c67-63bc-4852-8a5d-8ff8bb56ccd1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-unplugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.196 182759 WARNING nova.compute.manager [req-008fb41e-0543-4b1f-8283-affaf814c572 req-dd453c67-63bc-4852-8a5d-8ff8bb56ccd1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received unexpected event network-vif-unplugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with vm_state active and task_state rescuing.#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.445 182759 INFO nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance shutdown successfully after 13 seconds.#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.452 182759 INFO nova.virt.libvirt.driver [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance destroyed successfully.#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.453 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'numa_topology' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.474 182759 INFO nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Attempting rescue#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.475 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.480 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.480 182759 INFO nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Creating image(s)#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.481 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.481 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.482 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.482 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.515 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.516 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.527 182759 DEBUG oslo_concurrency.processutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.616 182759 DEBUG oslo_concurrency.processutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.618 182759 DEBUG oslo_concurrency.processutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.677 182759 DEBUG oslo_concurrency.processutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.679 182759 DEBUG oslo_concurrency.lockutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.680 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'migration_context' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.697 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.698 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Start _get_guest_xml network_info=[{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1280377146-network", "vif_mac": "fa:16:3e:52:ad:ee"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.699 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'resources' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.723 182759 WARNING nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.728 182759 DEBUG nova.virt.libvirt.host [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.729 182759 DEBUG nova.virt.libvirt.host [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.730 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.731 182759 DEBUG nova.virt.libvirt.host [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.732 182759 DEBUG nova.virt.libvirt.host [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.733 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.733 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.734 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.734 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.734 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.734 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.734 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.735 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.735 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.735 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.735 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.735 182759 DEBUG nova.virt.hardware [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.736 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.753 182759 DEBUG nova.virt.libvirt.vif [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-927653202',display_name='tempest-ServerRescueTestJSON-server-927653202',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-927653202',id=110,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-8xr05bq6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-401787473-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:59Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=cacae884-d2ca-4741-952f-59ffbb641328,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1280377146-network", "vif_mac": "fa:16:3e:52:ad:ee"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.753 182759 DEBUG nova.network.os_vif_util [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1280377146-network", "vif_mac": "fa:16:3e:52:ad:ee"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.754 182759 DEBUG nova.network.os_vif_util [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.755 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'pci_devices' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.776 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <uuid>cacae884-d2ca-4741-952f-59ffbb641328</uuid>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <name>instance-0000006e</name>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerRescueTestJSON-server-927653202</nova:name>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:09:20</nova:creationTime>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:user uuid="8324d8ba232c476e925d31b7d5645a7a">tempest-ServerRescueTestJSON-401787473-project-member</nova:user>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:project uuid="3b9315c6168049d79f20d630e51ffff3">tempest-ServerRescueTestJSON-401787473</nova:project>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        <nova:port uuid="ebf5a837-6957-4227-9b3d-1ae66eb381bd">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <entry name="serial">cacae884-d2ca-4741-952f-59ffbb641328</entry>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <entry name="uuid">cacae884-d2ca-4741-952f-59ffbb641328</entry>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <target dev="vdb" bus="virtio"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config.rescue"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:52:ad:ee"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <target dev="tapebf5a837-69"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/console.log" append="off"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:09:20 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:09:20 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:09:20 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:09:20 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.783 182759 INFO nova.virt.libvirt.driver [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance destroyed successfully.#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.838 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.839 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.839 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.839 182759 DEBUG nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] No VIF found with MAC fa:16:3e:52:ad:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.840 182759 INFO nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Using config drive#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.864 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:20 np0005591285 nova_compute[182755]: 2026-01-22 00:09:20.895 182759 DEBUG nova.objects.instance [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'keypairs' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.423 182759 INFO nova.virt.libvirt.driver [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Creating config drive at /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config.rescue#033[00m
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.433 182759 DEBUG oslo_concurrency.processutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmy1pk6d9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.562 182759 DEBUG oslo_concurrency.processutils [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmy1pk6d9" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:21 np0005591285 kernel: tapebf5a837-69: entered promiscuous mode
Jan 21 19:09:21 np0005591285 systemd-udevd[228259]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:09:21 np0005591285 NetworkManager[55017]: <info>  [1769040561.6396] manager: (tapebf5a837-69): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Jan 21 19:09:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:21Z|00408|binding|INFO|Claiming lport ebf5a837-6957-4227-9b3d-1ae66eb381bd for this chassis.
Jan 21 19:09:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:21Z|00409|binding|INFO|ebf5a837-6957-4227-9b3d-1ae66eb381bd: Claiming fa:16:3e:52:ad:ee 10.100.0.11
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:21.652 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:ad:ee 10.100.0.11'], port_security=['fa:16:3e:52:ad:ee 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cacae884-d2ca-4741-952f-59ffbb641328', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ebf5a837-6957-4227-9b3d-1ae66eb381bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:09:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:21.653 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ebf5a837-6957-4227-9b3d-1ae66eb381bd in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 bound to our chassis#033[00m
Jan 21 19:09:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:21.653 104259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 21 19:09:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:09:21.654 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d707cda2-0411-4f89-8cf2-a28ed7821e80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:09:21 np0005591285 NetworkManager[55017]: <info>  [1769040561.6569] device (tapebf5a837-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:09:21 np0005591285 NetworkManager[55017]: <info>  [1769040561.6594] device (tapebf5a837-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:09:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:21Z|00410|binding|INFO|Setting lport ebf5a837-6957-4227-9b3d-1ae66eb381bd ovn-installed in OVS
Jan 21 19:09:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:09:21Z|00411|binding|INFO|Setting lport ebf5a837-6957-4227-9b3d-1ae66eb381bd up in Southbound
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.665 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.670 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:21 np0005591285 systemd-machined[154022]: New machine qemu-50-instance-0000006e.
Jan 21 19:09:21 np0005591285 systemd[1]: Started Virtual Machine qemu-50-instance-0000006e.
Jan 21 19:09:21 np0005591285 nova_compute[182755]: 2026-01-22 00:09:21.849 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.063 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for cacae884-d2ca-4741-952f-59ffbb641328 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.064 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040562.063409, cacae884-d2ca-4741-952f-59ffbb641328 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.064 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.088 182759 DEBUG nova.compute.manager [None req-2e0f08a9-0141-4546-ad53-c5e4e194343b 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.090 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.096 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.162 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.163 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040562.064404, cacae884-d2ca-4741-952f-59ffbb641328 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.163 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] VM Started (Lifecycle Event)#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.199 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.201 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.302 182759 DEBUG nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.303 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.304 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.304 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.305 182759 DEBUG nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.305 182759 WARNING nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received unexpected event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with vm_state rescued and task_state None.#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.306 182759 DEBUG nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.306 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.307 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.307 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.308 182759 DEBUG nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.308 182759 WARNING nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received unexpected event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with vm_state rescued and task_state None.#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.309 182759 DEBUG nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.310 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.310 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.311 182759 DEBUG oslo_concurrency.lockutils [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.311 182759 DEBUG nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:09:22 np0005591285 nova_compute[182755]: 2026-01-22 00:09:22.312 182759 WARNING nova.compute.manager [req-90cbe5e6-23cd-402f-823b-bb3798838db8 req-efa6364f-c033-4b35-8972-56102ab55b7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received unexpected event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with vm_state rescued and task_state None.#033[00m
Jan 21 19:09:23 np0005591285 nova_compute[182755]: 2026-01-22 00:09:23.151 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:23 np0005591285 nova_compute[182755]: 2026-01-22 00:09:23.294 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:25 np0005591285 podman[228322]: 2026-01-22 00:09:25.191365001 +0000 UTC m=+0.059160053 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:09:25 np0005591285 nova_compute[182755]: 2026-01-22 00:09:25.684 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040550.6835153, 4e87b9c8-cfba-431e-966e-24799ad0ece2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:09:25 np0005591285 nova_compute[182755]: 2026-01-22 00:09:25.685 182759 INFO nova.compute.manager [-] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:09:25 np0005591285 nova_compute[182755]: 2026-01-22 00:09:25.708 182759 DEBUG nova.compute.manager [None req-eed2be2e-af3b-4b6b-9211-9fe3352e58aa - - - - - -] [instance: 4e87b9c8-cfba-431e-966e-24799ad0ece2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:09:25 np0005591285 nova_compute[182755]: 2026-01-22 00:09:25.732 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:26 np0005591285 nova_compute[182755]: 2026-01-22 00:09:26.850 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:29 np0005591285 podman[228348]: 2026-01-22 00:09:29.196802814 +0000 UTC m=+0.052460722 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:09:29 np0005591285 podman[228347]: 2026-01-22 00:09:29.226496713 +0000 UTC m=+0.074945207 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 19:09:29 np0005591285 podman[228349]: 2026-01-22 00:09:29.268849202 +0000 UTC m=+0.108256864 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:09:30 np0005591285 nova_compute[182755]: 2026-01-22 00:09:30.735 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:31 np0005591285 nova_compute[182755]: 2026-01-22 00:09:31.852 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:35 np0005591285 nova_compute[182755]: 2026-01-22 00:09:35.739 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:36 np0005591285 nova_compute[182755]: 2026-01-22 00:09:36.766 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:36 np0005591285 nova_compute[182755]: 2026-01-22 00:09:36.854 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:37 np0005591285 nova_compute[182755]: 2026-01-22 00:09:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:37 np0005591285 nova_compute[182755]: 2026-01-22 00:09:37.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:09:40 np0005591285 nova_compute[182755]: 2026-01-22 00:09:40.742 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:41 np0005591285 nova_compute[182755]: 2026-01-22 00:09:41.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:41 np0005591285 nova_compute[182755]: 2026-01-22 00:09:41.967 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:42 np0005591285 nova_compute[182755]: 2026-01-22 00:09:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:43 np0005591285 nova_compute[182755]: 2026-01-22 00:09:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.272 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.272 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.310 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.438 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.439 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.445 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.445 182759 INFO nova.compute.claims [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.544 182759 DEBUG nova.scheduler.client.report [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.565 182759 DEBUG nova.scheduler.client.report [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.565 182759 DEBUG nova.compute.provider_tree [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.586 182759 DEBUG nova.scheduler.client.report [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.605 182759 DEBUG nova.scheduler.client.report [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.661 182759 DEBUG nova.compute.provider_tree [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.677 182759 DEBUG nova.scheduler.client.report [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.713 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.714 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.778 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.778 182759 DEBUG nova.network.neutron [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.811 182759 INFO nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:09:44 np0005591285 nova_compute[182755]: 2026-01-22 00:09:44.833 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.018 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.020 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.021 182759 INFO nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Creating image(s)#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.021 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.022 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.023 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
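The disk.info acquire/release sequence above serializes writers of the per-instance image-format cache. The same pattern can be sketched with a plain advisory file lock (real Nova goes through oslo.concurrency's external lock files; the helper below is illustrative only):

```python
import fcntl
import json

def write_disk_info(info_path, disk_path, fmt):
    # Serialize concurrent writers with an exclusive advisory lock, then
    # read-modify-write the JSON mapping of disk path -> driver format.
    with open(info_path, "a+") as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        try:
            f.seek(0)
            raw = f.read()
            data = json.loads(raw) if raw else {}
            data[disk_path] = fmt
            f.seek(0)
            f.truncate()
            f.write(json.dumps(data))
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

The held time logged (0.001s) reflects that only the small JSON rewrite happens under the lock.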
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.041 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.076 182759 DEBUG nova.policy [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.117 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
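Each qemu-img probe above is wrapped in `oslo_concurrency.prlimit`, which caps the child's address space (`--as`, 1 GiB here) and CPU time (`--cpu`, 30 s) so a malformed image cannot wedge the compute service. The same guard can be sketched directly with resource limits applied in the child (illustrative helper, not Nova's code):

```python
import resource
import subprocess

def run_limited(cmd, as_bytes=1 << 30, cpu_seconds=30):
    # Apply RLIMIT_AS and RLIMIT_CPU in the child before exec, similar in
    # spirit to `python -m oslo_concurrency.prlimit --as=... --cpu=... -- cmd`.
    def set_limits():
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    return subprocess.run(cmd, preexec_fn=set_limits,
                          capture_output=True, text=True)
```

A runaway child then dies with SIGKILL/SIGXCPU instead of exhausting the host, and the caller sees an ordinary nonzero return code.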
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.118 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.119 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.134 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.188 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.189 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.222 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
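The create step above builds a copy-on-write qcow2 overlay on top of the cached base image in `_base`, so the instance disk starts as a thin delta rather than a full copy. A sketch of the same invocation (assumes `qemu-img` is on PATH; `overlay_cmd`/`run_create` are our wrapper names):

```python
import subprocess

def overlay_cmd(base, overlay, size_bytes):
    # Same shape as the logged command: a qcow2 overlay whose backing
    # file is the raw base image, sized to the flavor's root disk.
    return ["qemu-img", "create", "-f", "qcow2",
            "-o", f"backing_file={base},backing_fmt=raw",
            overlay, str(size_bytes)]

def run_create(base, overlay, size_bytes):
    subprocess.run(overlay_cmd(base, overlay, size_bytes),
                   env={"LC_ALL": "C", "LANG": "C"}, check=True)
```

Pinning `backing_fmt` explicitly avoids qemu probing the backing file's format, which matters for both speed and safety.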
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.223 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.223 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.277 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.278 182759 DEBUG nova.virt.disk.api [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.278 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.305 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.306 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.306 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.339 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.340 182759 DEBUG nova.virt.disk.api [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
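The "Cannot resize image ... to a smaller size" debug line is the expected outcome of `can_resize_image` when the requested size does not exceed the disk's current virtual size: qcow2 disks may only grow, and here the overlay was already created at the requested 1073741824 bytes. The check reduces to (simplified from `nova.virt.disk.api`):

```python
def can_resize_image(virt_size, requested_size):
    # Growing is allowed; shrinking, or a same-size no-op as in the
    # log above, is refused.
    return requested_size > virt_size
```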
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.340 182759 DEBUG nova.objects.instance [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid c8ef6504-1e46-493c-a14c-7cd1bebde8e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.595 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.595 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Ensure instance console log exists: /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.596 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.596 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.596 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.599 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:09:45 np0005591285 nova_compute[182755]: 2026-01-22 00:09:45.745 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:46 np0005591285 nova_compute[182755]: 2026-01-22 00:09:46.095 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:09:46 np0005591285 nova_compute[182755]: 2026-01-22 00:09:46.095 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:09:46 np0005591285 nova_compute[182755]: 2026-01-22 00:09:46.095 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:09:46 np0005591285 nova_compute[182755]: 2026-01-22 00:09:46.096 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:09:46 np0005591285 nova_compute[182755]: 2026-01-22 00:09:46.119 182759 DEBUG nova.network.neutron [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Successfully created port: 058e919a-93d5-4b55-bae2-1ab8baad8296 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:09:46 np0005591285 nova_compute[182755]: 2026-01-22 00:09:46.969 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:47 np0005591285 podman[228444]: 2026-01-22 00:09:47.195556966 +0000 UTC m=+0.063334354 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 

This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Jan 21 19:09:47 np0005591285 podman[228445]: 2026-01-22 00:09:47.201227858 +0000 UTC m=+0.063617951 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.122 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updating instance_info_cache with network_info: [{"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
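The network_info payload above is what lands in the instance's info cache; the fixed address sits three levels down (VIF → network → subnets → ips). A small extractor over that shape (hypothetical helper, matching the structure logged here):

```python
def fixed_ips(network_info):
    # Walk VIF -> network -> subnets -> ips, keeping "fixed" addresses.
    addrs = []
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            addrs.extend(ip["address"]
                         for ip in subnet["ips"] if ip["type"] == "fixed")
    return addrs

# Trimmed-down instance of the cache payload above.
sample = [{"network": {"subnets": [{"ips": [
    {"address": "10.100.0.11", "type": "fixed"}]}]}}]
print(fixed_ips(sample))
```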
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.154 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-cacae884-d2ca-4741-952f-59ffbb641328" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.154 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.155 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.156 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.156 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.188 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.189 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.190 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.190 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.292 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.356 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.357 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.423 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk.rescue --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.424 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.477 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.478 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.533 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.660 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.662 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5589MB free_disk=73.16352462768555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.662 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.662 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.973 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance cacae884-d2ca-4741-952f-59ffbb641328 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.973 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance c8ef6504-1e46-493c-a14c-7cd1bebde8e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.973 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:09:49 np0005591285 nova_compute[182755]: 2026-01-22 00:09:49.973 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:09:50 np0005591285 nova_compute[182755]: 2026-01-22 00:09:50.055 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:09:50 np0005591285 nova_compute[182755]: 2026-01-22 00:09:50.075 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:09:50 np0005591285 nova_compute[182755]: 2026-01-22 00:09:50.122 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:09:50 np0005591285 nova_compute[182755]: 2026-01-22 00:09:50.122 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:09:50 np0005591285 nova_compute[182755]: 2026-01-22 00:09:50.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:51 np0005591285 nova_compute[182755]: 2026-01-22 00:09:51.971 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:53 np0005591285 nova_compute[182755]: 2026-01-22 00:09:53.727 182759 DEBUG nova.network.neutron [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Successfully updated port: 058e919a-93d5-4b55-bae2-1ab8baad8296 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:09:53 np0005591285 nova_compute[182755]: 2026-01-22 00:09:53.750 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:09:53 np0005591285 nova_compute[182755]: 2026-01-22 00:09:53.751 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:09:53 np0005591285 nova_compute[182755]: 2026-01-22 00:09:53.751 182759 DEBUG nova.network.neutron [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:09:54 np0005591285 nova_compute[182755]: 2026-01-22 00:09:54.008 182759 DEBUG nova.compute.manager [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-changed-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:09:54 np0005591285 nova_compute[182755]: 2026-01-22 00:09:54.009 182759 DEBUG nova.compute.manager [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Refreshing instance network info cache due to event network-changed-058e919a-93d5-4b55-bae2-1ab8baad8296. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:09:54 np0005591285 nova_compute[182755]: 2026-01-22 00:09:54.009 182759 DEBUG oslo_concurrency.lockutils [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:09:55 np0005591285 nova_compute[182755]: 2026-01-22 00:09:55.327 182759 DEBUG nova.network.neutron [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:09:55 np0005591285 nova_compute[182755]: 2026-01-22 00:09:55.803 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:09:56 np0005591285 podman[228500]: 2026-01-22 00:09:56.193686527 +0000 UTC m=+0.059848350 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:09:56 np0005591285 nova_compute[182755]: 2026-01-22 00:09:56.972 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:00 np0005591285 podman[228523]: 2026-01-22 00:10:00.173561283 +0000 UTC m=+0.049254567 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 19:10:00 np0005591285 podman[228524]: 2026-01-22 00:10:00.188799592 +0000 UTC m=+0.060192570 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.226 182759 DEBUG nova.network.neutron [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updating instance_info_cache with network_info: [{"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:10:00 np0005591285 podman[228525]: 2026-01-22 00:10:00.275113644 +0000 UTC m=+0.139671438 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.318 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.319 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Instance network_info: |[{"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.319 182759 DEBUG oslo_concurrency.lockutils [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.320 182759 DEBUG nova.network.neutron [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Refreshing network info cache for port 058e919a-93d5-4b55-bae2-1ab8baad8296 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.323 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Start _get_guest_xml network_info=[{"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.328 182759 WARNING nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.337 182759 DEBUG nova.virt.libvirt.host [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.338 182759 DEBUG nova.virt.libvirt.host [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.342 182759 DEBUG nova.virt.libvirt.host [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.343 182759 DEBUG nova.virt.libvirt.host [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.345 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.345 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.346 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.346 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.346 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.346 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.347 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.347 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.347 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.347 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.348 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.348 182759 DEBUG nova.virt.hardware [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.352 182759 DEBUG nova.virt.libvirt.vif [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1980643582',display_name='tempest-TestNetworkBasicOps-server-1980643582',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1980643582',id=114,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZ6XnigKgOH1qgP1qM1cbTSG+YSKJr4HPXzLmrePgGjqOoDXJutLYqYLK4oOPuYV4HHBQRnMUsgviAJJBFwFGWl4kwT3DJVEor+PG84hf0+tujPqIms5W8Uc9EI4E+YGQ==',key_name='tempest-TestNetworkBasicOps-575491139',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-dzas0rtl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:44Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=c8ef6504-1e46-493c-a14c-7cd1bebde8e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.352 182759 DEBUG nova.network.os_vif_util [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.353 182759 DEBUG nova.network.os_vif_util [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.354 182759 DEBUG nova.objects.instance [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid c8ef6504-1e46-493c-a14c-7cd1bebde8e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.401 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <uuid>c8ef6504-1e46-493c-a14c-7cd1bebde8e0</uuid>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <name>instance-00000072</name>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-1980643582</nova:name>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:10:00</nova:creationTime>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        <nova:port uuid="058e919a-93d5-4b55-bae2-1ab8baad8296">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <entry name="serial">c8ef6504-1e46-493c-a14c-7cd1bebde8e0</entry>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <entry name="uuid">c8ef6504-1e46-493c-a14c-7cd1bebde8e0</entry>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.config"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:d8:3a:e3"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <target dev="tap058e919a-93"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/console.log" append="off"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:10:00 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:10:00 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:10:00 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:10:00 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.403 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Preparing to wait for external event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.403 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.403 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.404 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.405 182759 DEBUG nova.virt.libvirt.vif [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1980643582',display_name='tempest-TestNetworkBasicOps-server-1980643582',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1980643582',id=114,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZ6XnigKgOH1qgP1qM1cbTSG+YSKJr4HPXzLmrePgGjqOoDXJutLYqYLK4oOPuYV4HHBQRnMUsgviAJJBFwFGWl4kwT3DJVEor+PG84hf0+tujPqIms5W8Uc9EI4E+YGQ==',key_name='tempest-TestNetworkBasicOps-575491139',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-dzas0rtl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:44Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=c8ef6504-1e46-493c-a14c-7cd1bebde8e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.405 182759 DEBUG nova.network.os_vif_util [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.406 182759 DEBUG nova.network.os_vif_util [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.406 182759 DEBUG os_vif [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.407 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.408 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.408 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.413 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap058e919a-93, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.414 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap058e919a-93, col_values=(('external_ids', {'iface-id': '058e919a-93d5-4b55-bae2-1ab8baad8296', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:3a:e3', 'vm-uuid': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:00 np0005591285 NetworkManager[55017]: <info>  [1769040600.4175] manager: (tap058e919a-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.417 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.427 182759 INFO os_vif [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93')#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.730 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.730 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.731 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:d8:3a:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:10:00 np0005591285 nova_compute[182755]: 2026-01-22 00:10:00.731 182759 INFO nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Using config drive#033[00m
Jan 21 19:10:01 np0005591285 nova_compute[182755]: 2026-01-22 00:10:01.974 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.411 182759 INFO nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Creating config drive at /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.config#033[00m
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.417 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpujc5m74j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.549 182759 DEBUG oslo_concurrency.processutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpujc5m74j" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:10:02 np0005591285 kernel: tap058e919a-93: entered promiscuous mode
Jan 21 19:10:02 np0005591285 NetworkManager[55017]: <info>  [1769040602.6104] manager: (tap058e919a-93): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Jan 21 19:10:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:02Z|00412|binding|INFO|Claiming lport 058e919a-93d5-4b55-bae2-1ab8baad8296 for this chassis.
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:02Z|00413|binding|INFO|058e919a-93d5-4b55-bae2-1ab8baad8296: Claiming fa:16:3e:d8:3a:e3 10.100.0.3
Jan 21 19:10:02 np0005591285 systemd-udevd[228607]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:10:02 np0005591285 NetworkManager[55017]: <info>  [1769040602.6600] device (tap058e919a-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:10:02 np0005591285 NetworkManager[55017]: <info>  [1769040602.6607] device (tap058e919a-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:10:02 np0005591285 systemd-machined[154022]: New machine qemu-51-instance-00000072.
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.673 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:02Z|00414|binding|INFO|Setting lport 058e919a-93d5-4b55-bae2-1ab8baad8296 ovn-installed in OVS
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.678 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 systemd[1]: Started Virtual Machine qemu-51-instance-00000072.
Jan 21 19:10:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:02Z|00415|binding|INFO|Setting lport 058e919a-93d5-4b55-bae2-1ab8baad8296 up in Southbound
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.703 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:3a:e3 10.100.0.3'], port_security=['fa:16:3e:d8:3a:e3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a1ecbca0-9209-47aa-82bf-9c5d3641042f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797d444f-c0c6-4f3e-8fc3-fb2e70d06d89, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=058e919a-93d5-4b55-bae2-1ab8baad8296) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.705 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 058e919a-93d5-4b55-bae2-1ab8baad8296 in datapath 8eb4d5e9-8ac1-4d97-a3ca-29514334d492 bound to our chassis#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.707 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8eb4d5e9-8ac1-4d97-a3ca-29514334d492#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.720 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d470b04b-f46f-4d33-b9ac-bf4a6896df92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.721 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8eb4d5e9-81 in ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.723 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8eb4d5e9-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.724 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3018a4d3-1d8f-4333-8748-5dad808601e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.724 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[865181d4-df32-410c-9695-a79f943aba52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.736 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[06af4701-33d5-4f33-acd0-0258486af077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.748 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[118a33f0-139a-406c-b5a7-964a84bdbc63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.777 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[274f8d65-7f66-4f85-bc0d-fa7d42f9455f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.782 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[073cda64-0ada-41e8-9709-00756a35bf05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 NetworkManager[55017]: <info>  [1769040602.7837] manager: (tap8eb4d5e9-80): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.812 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4813db90-ae0f-40d7-ad32-62351234e6e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.815 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[1af3af5d-ac8f-4e0c-bac2-fae9463230af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 NetworkManager[55017]: <info>  [1769040602.8378] device (tap8eb4d5e9-80): carrier: link connected
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.844 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f59076c6-6abc-4dc7-a0bb-b151746a5c61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.864 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eabb285c-bc77-4d20-9eb1-dd23f5f0ba99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eb4d5e9-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:a9:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516149, 'reachable_time': 43239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228642, 'error': None, 'target': 'ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.878 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[58856cd2-702f-4ec1-bed8-e5b3af3cdaa6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:a914'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516149, 'tstamp': 516149}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228643, 'error': None, 'target': 'ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.894 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e74edab2-5691-4cdd-b50a-f337a5ce6638]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eb4d5e9-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:a9:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516149, 'reachable_time': 43239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228644, 'error': None, 'target': 'ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.922 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e71d7e50-1a7c-4d13-95f8-4a834689a31b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.971 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[00e4021f-0079-411f-b6b6-22ab61a1bb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.973 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eb4d5e9-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.973 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.973 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.974 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.974 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.974 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8eb4d5e9-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.976 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 kernel: tap8eb4d5e9-80: entered promiscuous mode
Jan 21 19:10:02 np0005591285 NetworkManager[55017]: <info>  [1769040602.9767] manager: (tap8eb4d5e9-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.979 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8eb4d5e9-80, col_values=(('external_ids', {'iface-id': '552f0b46-d8c2-456a-a665-639ee7e20754'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.980 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:02Z|00416|binding|INFO|Releasing lport 552f0b46-d8c2-456a-a665-639ee7e20754 from this chassis (sb_readonly=0)
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.982 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.983 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8eb4d5e9-8ac1-4d97-a3ca-29514334d492.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8eb4d5e9-8ac1-4d97-a3ca-29514334d492.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.983 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1a67fa23-a356-4b23-a1ea-e7d7b6b28f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.984 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-8eb4d5e9-8ac1-4d97-a3ca-29514334d492
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/8eb4d5e9-8ac1-4d97-a3ca-29514334d492.pid.haproxy
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 8eb4d5e9-8ac1-4d97-a3ca-29514334d492
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:10:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:02.984 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'env', 'PROCESS_TAG=haproxy-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8eb4d5e9-8ac1-4d97-a3ca-29514334d492.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:10:02 np0005591285 nova_compute[182755]: 2026-01-22 00:10:02.993 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:03 np0005591285 podman[228676]: 2026-01-22 00:10:03.336299288 +0000 UTC m=+0.025915948 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:10:03 np0005591285 nova_compute[182755]: 2026-01-22 00:10:03.996 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040603.9948876, c8ef6504-1e46-493c-a14c-7cd1bebde8e0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:10:03 np0005591285 nova_compute[182755]: 2026-01-22 00:10:03.997 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] VM Started (Lifecycle Event)#033[00m
Jan 21 19:10:04 np0005591285 nova_compute[182755]: 2026-01-22 00:10:04.046 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:04.046 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:10:04 np0005591285 nova_compute[182755]: 2026-01-22 00:10:04.199 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:10:04 np0005591285 nova_compute[182755]: 2026-01-22 00:10:04.205 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040603.9950564, c8ef6504-1e46-493c-a14c-7cd1bebde8e0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:10:04 np0005591285 nova_compute[182755]: 2026-01-22 00:10:04.205 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:10:04 np0005591285 nova_compute[182755]: 2026-01-22 00:10:04.497 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:10:04 np0005591285 nova_compute[182755]: 2026-01-22 00:10:04.502 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:10:04 np0005591285 podman[228676]: 2026-01-22 00:10:04.679567392 +0000 UTC m=+1.369184032 container create 8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 19:10:04 np0005591285 systemd[1]: Started libpod-conmon-8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b.scope.
Jan 21 19:10:04 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:10:04 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38551ce169bfd3d73987e0824f50f46143f727f0b109cb1a2d89281396f11553/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:10:05 np0005591285 podman[228676]: 2026-01-22 00:10:05.306207068 +0000 UTC m=+1.995823738 container init 8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 19:10:05 np0005591285 podman[228676]: 2026-01-22 00:10:05.314055919 +0000 UTC m=+2.003672559 container start 8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:10:05 np0005591285 nova_compute[182755]: 2026-01-22 00:10:05.319 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:10:05 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [NOTICE]   (228702) : New worker (228704) forked
Jan 21 19:10:05 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [NOTICE]   (228702) : Loading success.
Jan 21 19:10:05 np0005591285 nova_compute[182755]: 2026-01-22 00:10:05.417 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:05.970 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:10:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:05.971 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:05 np0005591285 nova_compute[182755]: 2026-01-22 00:10:05.974 182759 DEBUG nova.network.neutron [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updated VIF entry in instance network info cache for port 058e919a-93d5-4b55-bae2-1ab8baad8296. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:10:05 np0005591285 nova_compute[182755]: 2026-01-22 00:10:05.975 182759 DEBUG nova.network.neutron [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updating instance_info_cache with network_info: [{"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:10:06 np0005591285 nova_compute[182755]: 2026-01-22 00:10:06.010 182759 DEBUG oslo_concurrency.lockutils [req-53ff2c58-0a7d-4eb3-84e2-b2249cb5174a req-3bda6447-8fa9-4421-a4dd-2d39b30053e5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:10:06 np0005591285 nova_compute[182755]: 2026-01-22 00:10:06.976 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.066 182759 DEBUG nova.compute.manager [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.067 182759 DEBUG oslo_concurrency.lockutils [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.067 182759 DEBUG oslo_concurrency.lockutils [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.067 182759 DEBUG oslo_concurrency.lockutils [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.067 182759 DEBUG nova.compute.manager [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Processing event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.068 182759 DEBUG nova.compute.manager [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.068 182759 DEBUG oslo_concurrency.lockutils [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.068 182759 DEBUG oslo_concurrency.lockutils [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.068 182759 DEBUG oslo_concurrency.lockutils [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.068 182759 DEBUG nova.compute.manager [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] No waiting events found dispatching network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.069 182759 WARNING nova.compute.manager [req-de2ec185-3250-4836-a48f-ee3692c06add req-fd000feb-9051-44cd-a76a-8ef4a97e9b60 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received unexpected event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.069 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.074 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040607.0737953, c8ef6504-1e46-493c-a14c-7cd1bebde8e0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.074 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.076 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.080 182759 INFO nova.virt.libvirt.driver [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Instance spawned successfully.#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.080 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.192 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.199 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.203 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.203 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.204 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.204 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.205 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.205 182759 DEBUG nova.virt.libvirt.driver [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:10:07 np0005591285 nova_compute[182755]: 2026-01-22 00:10:07.320 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:10:08 np0005591285 nova_compute[182755]: 2026-01-22 00:10:08.562 182759 INFO nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Took 23.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:10:08 np0005591285 nova_compute[182755]: 2026-01-22 00:10:08.563 182759 DEBUG nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:10:08 np0005591285 nova_compute[182755]: 2026-01-22 00:10:08.834 182759 INFO nova.compute.manager [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Took 24.43 seconds to build instance.#033[00m
Jan 21 19:10:08 np0005591285 nova_compute[182755]: 2026-01-22 00:10:08.862 182759 DEBUG oslo_concurrency.lockutils [None req-aac980e4-9f3c-4a41-94d3-08d43f8e200e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:10 np0005591285 nova_compute[182755]: 2026-01-22 00:10:10.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:11 np0005591285 nova_compute[182755]: 2026-01-22 00:10:11.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:15 np0005591285 nova_compute[182755]: 2026-01-22 00:10:15.422 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:16 np0005591285 nova_compute[182755]: 2026-01-22 00:10:16.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:17 np0005591285 NetworkManager[55017]: <info>  [1769040617.7126] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Jan 21 19:10:17 np0005591285 NetworkManager[55017]: <info>  [1769040617.7140] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Jan 21 19:10:17 np0005591285 nova_compute[182755]: 2026-01-22 00:10:17.714 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:17 np0005591285 nova_compute[182755]: 2026-01-22 00:10:17.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:17 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:17Z|00417|binding|INFO|Releasing lport 552f0b46-d8c2-456a-a665-639ee7e20754 from this chassis (sb_readonly=0)
Jan 21 19:10:17 np0005591285 nova_compute[182755]: 2026-01-22 00:10:17.781 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:18Z|00418|binding|INFO|Releasing lport 552f0b46-d8c2-456a-a665-639ee7e20754 from this chassis (sb_readonly=0)
Jan 21 19:10:18 np0005591285 podman[228714]: 2026-01-22 00:10:18.188538553 +0000 UTC m=+0.060782626 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.209 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 podman[228715]: 2026-01-22 00:10:18.226837473 +0000 UTC m=+0.095164891 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.446 182759 DEBUG nova.compute.manager [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-changed-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.446 182759 DEBUG nova.compute.manager [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Refreshing instance network info cache due to event network-changed-058e919a-93d5-4b55-bae2-1ab8baad8296. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.447 182759 DEBUG oslo_concurrency.lockutils [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.447 182759 DEBUG oslo_concurrency.lockutils [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.447 182759 DEBUG nova.network.neutron [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Refreshing network info cache for port 058e919a-93d5-4b55-bae2-1ab8baad8296 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.646 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.646 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.647 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.647 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.647 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.658 182759 INFO nova.compute.manager [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Terminating instance#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.670 182759 DEBUG nova.compute.manager [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:10:18 np0005591285 kernel: tapebf5a837-69 (unregistering): left promiscuous mode
Jan 21 19:10:18 np0005591285 NetworkManager[55017]: <info>  [1769040618.6986] device (tapebf5a837-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.707 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:18Z|00419|binding|INFO|Releasing lport ebf5a837-6957-4227-9b3d-1ae66eb381bd from this chassis (sb_readonly=0)
Jan 21 19:10:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:18Z|00420|binding|INFO|Setting lport ebf5a837-6957-4227-9b3d-1ae66eb381bd down in Southbound
Jan 21 19:10:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:18Z|00421|binding|INFO|Removing iface tapebf5a837-69 ovn-installed in OVS
Jan 21 19:10:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:18.717 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:ad:ee 10.100.0.11'], port_security=['fa:16:3e:52:ad:ee 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cacae884-d2ca-4741-952f-59ffbb641328', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3b9315c6168049d79f20d630e51ffff3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '88dd83ff-b733-44b2-9065-8f39dcf83d23', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ada6e58f-6492-44c0-abaa-a00698af112f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ebf5a837-6957-4227-9b3d-1ae66eb381bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:18.721 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ebf5a837-6957-4227-9b3d-1ae66eb381bd in datapath eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 unbound from our chassis#033[00m
Jan 21 19:10:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:18.724 104259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 21 19:10:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:18.726 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2ba35a-e122-45ce-9cef-ec0c63a0f0c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:18 np0005591285 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 21 19:10:18 np0005591285 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000006e.scope: Consumed 14.558s CPU time.
Jan 21 19:10:18 np0005591285 systemd-machined[154022]: Machine qemu-50-instance-0000006e terminated.
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.944 182759 INFO nova.virt.libvirt.driver [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Instance destroyed successfully.#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.944 182759 DEBUG nova.objects.instance [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lazy-loading 'resources' on Instance uuid cacae884-d2ca-4741-952f-59ffbb641328 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.971 182759 DEBUG nova.virt.libvirt.vif [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-927653202',display_name='tempest-ServerRescueTestJSON-server-927653202',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-927653202',id=110,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3b9315c6168049d79f20d630e51ffff3',ramdisk_id='',reservation_id='r-8xr05bq6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-401787473',owner_user_name='tempest-ServerRescueTestJSON-401787473-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:09:22Z,user_data=None,user_id='8324d8ba232c476e925d31b7d5645a7a',uuid=cacae884-d2ca-4741-952f-59ffbb641328,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.972 182759 DEBUG nova.network.os_vif_util [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converting VIF {"id": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "address": "fa:16:3e:52:ad:ee", "network": {"id": "eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1280377146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3b9315c6168049d79f20d630e51ffff3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebf5a837-69", "ovs_interfaceid": "ebf5a837-6957-4227-9b3d-1ae66eb381bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.972 182759 DEBUG nova.network.os_vif_util [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.973 182759 DEBUG os_vif [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.975 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.976 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebf5a837-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.983 182759 INFO os_vif [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:ad:ee,bridge_name='br-int',has_traffic_filtering=True,id=ebf5a837-6957-4227-9b3d-1ae66eb381bd,network=Network(eaa59f49-9477-4d9b-9a5e-8f1eb5cf78a6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebf5a837-69')#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.984 182759 INFO nova.virt.libvirt.driver [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Deleting instance files /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328_del#033[00m
Jan 21 19:10:18 np0005591285 nova_compute[182755]: 2026-01-22 00:10:18.985 182759 INFO nova.virt.libvirt.driver [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Deletion of /var/lib/nova/instances/cacae884-d2ca-4741-952f-59ffbb641328_del complete#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.112 182759 INFO nova.compute.manager [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.112 182759 DEBUG oslo.service.loopingcall [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.113 182759 DEBUG nova.compute.manager [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.113 182759 DEBUG nova.network.neutron [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:10:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:19Z|00422|binding|INFO|Releasing lport 552f0b46-d8c2-456a-a665-639ee7e20754 from this chassis (sb_readonly=0)
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.539 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.941 182759 DEBUG nova.compute.manager [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-unplugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.942 182759 DEBUG oslo_concurrency.lockutils [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.942 182759 DEBUG oslo_concurrency.lockutils [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.942 182759 DEBUG oslo_concurrency.lockutils [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.943 182759 DEBUG nova.compute.manager [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-unplugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.951 182759 DEBUG nova.compute.manager [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-unplugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.951 182759 DEBUG nova.compute.manager [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.951 182759 DEBUG oslo_concurrency.lockutils [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "cacae884-d2ca-4741-952f-59ffbb641328-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.951 182759 DEBUG oslo_concurrency.lockutils [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.951 182759 DEBUG oslo_concurrency.lockutils [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.952 182759 DEBUG nova.compute.manager [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] No waiting events found dispatching network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:10:19 np0005591285 nova_compute[182755]: 2026-01-22 00:10:19.952 182759 WARNING nova.compute.manager [req-0e3ec53a-148e-4bc0-a5e8-b971fc14fe41 req-37570bf7-87c1-4218-ab6d-5349bead52dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received unexpected event network-vif-plugged-ebf5a837-6957-4227-9b3d-1ae66eb381bd for instance with vm_state rescued and task_state deleting.#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.116 182759 DEBUG nova.network.neutron [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.195 182759 INFO nova.compute.manager [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Took 2.08 seconds to deallocate network for instance.#033[00m
Jan 21 19:10:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:21Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:3a:e3 10.100.0.3
Jan 21 19:10:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:21Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:3a:e3 10.100.0.3
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.373 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.374 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.579 182759 DEBUG nova.compute.provider_tree [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.835 182759 DEBUG nova.scheduler.client.report [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.879 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.913 182759 DEBUG nova.compute.manager [req-516992ac-3c53-4423-95d7-4cb3891deffe req-e9e9d974-8a01-4628-9848-95ecec5ffdbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Received event network-vif-deleted-ebf5a837-6957-4227-9b3d-1ae66eb381bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.964 182759 INFO nova.scheduler.client.report [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Deleted allocations for instance cacae884-d2ca-4741-952f-59ffbb641328#033[00m
Jan 21 19:10:21 np0005591285 nova_compute[182755]: 2026-01-22 00:10:21.983 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:22 np0005591285 nova_compute[182755]: 2026-01-22 00:10:22.092 182759 DEBUG oslo_concurrency.lockutils [None req-f211b4da-3b79-4472-866c-d368daa58610 8324d8ba232c476e925d31b7d5645a7a 3b9315c6168049d79f20d630e51ffff3 - - default default] Lock "cacae884-d2ca-4741-952f-59ffbb641328" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:22 np0005591285 nova_compute[182755]: 2026-01-22 00:10:22.507 182759 DEBUG nova.network.neutron [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updated VIF entry in instance network info cache for port 058e919a-93d5-4b55-bae2-1ab8baad8296. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:10:22 np0005591285 nova_compute[182755]: 2026-01-22 00:10:22.507 182759 DEBUG nova.network.neutron [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updating instance_info_cache with network_info: [{"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:10:22 np0005591285 nova_compute[182755]: 2026-01-22 00:10:22.592 182759 DEBUG oslo_concurrency.lockutils [req-5ad67476-96b8-4a57-88f3-dc181323b3fb req-8a74a6ca-ed0a-45ed-9c36-e23b821a913c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.167 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'name': 'tempest-TestNetworkBasicOps-server-1980643582', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000072', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34b96b4037d24a0ea19383ca2477b2fd', 'user_id': '833f1e9dce90456ea55a443da6704907', 'hostId': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.173 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c8ef6504-1e46-493c-a14c-7cd1bebde8e0 / tap058e919a-93 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.173 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e3450b7-fef0-4d1d-9519-3ba497a19dee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.168712', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'bee1716a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': 'fed190428e7e5b3f91b461fb80e1dc09c7431d58dcaf4fc6723793a007e61d62'}]}, 'timestamp': '2026-01-22 00:10:23.175495', '_unique_id': '6ad1cff0f7094377a27c06905704c6db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.178 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.221 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.read.latency volume: 204535285 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.222 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.read.latency volume: 19489435 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1354ed0-b89a-40ba-b57b-b1df0f79b7bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204535285, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.180509', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bee8a4bc-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': 'b8384b9589bd03e6d0f7c912b26b2e4ba0b541ad816ef5c884f5925482cc013a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19489435, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.180509', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bee8be16-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': '2ff36f5bd071fa017c6cf1658330bff6f12e5a655385da305e4b607f00928680'}]}, 'timestamp': '2026-01-22 00:10:23.222929', '_unique_id': '94fbfd4334094297940f03566243671b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.224 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.225 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79abcd5b-6198-4dd2-8fb9-496ada448bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.225270', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'bee92932-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': 'b81bb557c8d0c26e9ab500cb42529252ad0957706e4be4d090ee6a677ce1466f'}]}, 'timestamp': '2026-01-22 00:10:23.225651', '_unique_id': '819cee7c46f94c8c8f176390ea720ca4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.227 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.244 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1683663b-51e6-4cf1-820c-16dcf46453fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'timestamp': '2026-01-22T00:10:23.227915', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'beec173c-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.963408224, 'message_signature': '3cbab868734e0a16446606a1edb02354c3dc992046d8734f8a44628c161e1ad0'}]}, 'timestamp': '2026-01-22 00:10:23.244897', '_unique_id': '0ceca471ba4d429ea317da86413d7cc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.246 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74af12bd-53d4-4cbb-8c53-bfb1380ecbd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.246699', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'beec6bba-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': '27840a72fd92c3fab7be9ab96244e97488205c76b12d11f9f503757f869efe81'}]}, 'timestamp': '2026-01-22 00:10:23.247008', '_unique_id': '7840f387041e4a589b0fa2edabb9aa8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.247 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.248 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.248 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>]
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.248 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.248 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45722ae3-bb4e-4e23-b29f-08cbb255743f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.248673', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'beecb89a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': 'c6c6086680e5eb5b8650aac339a68a523726076e83d3bf21aef46f5da45206bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.248673', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'beecc2d6-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': '17a2e75c1d10c69ae7ff0a47f315361af22629cfdbb812bf2166b8ce30bf8937'}]}, 'timestamp': '2026-01-22 00:10:23.249209', '_unique_id': '20a37487591145bdb3acfd54ad7f486a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.249 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.250 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.incoming.bytes volume: 1646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a79bac6c-8f5b-43b8-a86c-1a9ee1897525', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1646, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.250470', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'beecfea4-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': 'a3eecbb6910042efdbbb711bc5d2d789847b1a68194dbda7d147c75a9c2a1d1e'}]}, 'timestamp': '2026-01-22 00:10:23.250717', '_unique_id': '80f01ca81723421aa964ab0d6b967641'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.252 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4fb910f-8fae-42c2-a6f8-59816c07274f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.252081', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'beed3edc-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': '9b1594d54f9f89bdbc1c07ded5c2636389773cd9ead56194d264dd1c0471ece6'}]}, 'timestamp': '2026-01-22 00:10:23.252400', '_unique_id': '49b4ed743867418dbb30f824bb2b0641'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.253 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05e1a743-f398-42bb-8477-c993c1e6b407', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.253921', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'beed8720-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': '34b4711813a96df1393b8cacb8d85c607b9d13e4108b5f0a0ef4b44c8cf18545'}]}, 'timestamp': '2026-01-22 00:10:23.254261', '_unique_id': '766514e55bad4aaba98f692a5b3e7900'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.255 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffd3ea64-a05b-4a1d-86bd-20e79f19932d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.255552', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'beedc4ec-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': '331b2b56374088e27c52835644fd9c32a0cf6683d79618541721e6d1fb77b364'}]}, 'timestamp': '2026-01-22 00:10:23.255792', '_unique_id': '971771958b9c4034b56b4d3d9028fc58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.256 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.267 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.267 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '315a42c2-2346-4e3b-8b0d-d0b4a4c7505c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.257179', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'beef92f4-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.976408015, 'message_signature': '1db9b6ad99362b1d206c0798f0c108343700ceeb851ef89e1e5b7245a49ebef1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.257179', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'beef9df8-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.976408015, 'message_signature': 'de34f6c3fc37595db3d8ef0d5ad92b90f69d26bbc1aa445702aedfc107cdffda'}]}, 'timestamp': '2026-01-22 00:10:23.267921', '_unique_id': 'bbcfe642b8664a849c54472717998ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.268 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.269 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.write.latency volume: 3277493857 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.269 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a0f00b3-267d-4bef-8e2e-3dcef7e183e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3277493857, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.269505', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'beefe5d8-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': '3fee3259ce296d1be220ee2e0f084fd1b94c9dfdbe125a0cfe720b0ef9adca1c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.269505', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'beefeede-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': 'bd726865393d4d51647cac020de44269fdf1c61c7aec0532808dd71acdc53d6c'}]}, 'timestamp': '2026-01-22 00:10:23.269994', '_unique_id': '4d375680607b4c10b5c884f30f6e606f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.271 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ce35ac1-5794-448e-b81c-79dbf39bb23b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.271377', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'bef0301a-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': '452f37f4129265ff59b5a39129ed22122fa418d7c6cbcf7107e8cff59785e48e'}]}, 'timestamp': '2026-01-22 00:10:23.271643', '_unique_id': '853d3a5f0e954c26818983cf2427d9f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>]
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ab1cdd9-d454-47e8-a3c9-3b97f449df3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.273041', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bef06fc6-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.976408015, 'message_signature': '08a758a171a59666cae8e91571dba3d2829cde8a1a8db95ef58ab539ce56f13d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.273041', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bef077a0-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.976408015, 'message_signature': 'f2fd8908b297c3a708970a194a8a4a2b4144fa3a811bd30b29276d14b002f03a'}]}, 'timestamp': '2026-01-22 00:10:23.273453', '_unique_id': 'e1b87da2509a4344a66b215fff0797a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.273 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.274 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.274 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.274 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca58cfd9-3ee3-4b10-a137-91f64e908a5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.274594', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bef0acac-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.976408015, 'message_signature': 'a2a3dcf2d238c04ccb821f56b236def5a1a0145151864d5df5eb03ae05d03bdc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.274594', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bef0b5d0-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.976408015, 'message_signature': '731ef49bd3b447ab03ed3988e5ab47fd294e1123e04f31018cc810bbd8083204'}]}, 'timestamp': '2026-01-22 00:10:23.275047', '_unique_id': 'bde74f9f07294fa2bccc213c402d1cd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.275 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.276 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.276 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.read.bytes volume: 30206464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.276 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b16198f9-e4c7-4ec3-994d-f2c331e634ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30206464, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.276115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bef0e7d0-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': 'e32ec4909cc144bad0420b9ca741a4ea9b2149067d1c4de6366551de60864884'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.276115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bef0f0f4-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': 'a6621ed516ddb262ac6bb4f2464e80b1f060c70da1f9548094ff668eb762acf5'}]}, 'timestamp': '2026-01-22 00:10:23.276561', '_unique_id': 'f499c218bed24b23b818bf7455bf072c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.277 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/cpu volume: 12360000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'beae7a52-c7e0-4482-a53c-a3982cceb303', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12360000000, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'timestamp': '2026-01-22T00:10:23.277753', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bef12952-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.963408224, 'message_signature': 'dbcfae4cb2f2f55ad9ea3a21ed44f3449956346134731b2ef3ab27dd7190aea7'}]}, 'timestamp': '2026-01-22 00:10:23.278012', '_unique_id': '32dc4d7331484d1da2b5ae15301b7385'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.279 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.write.requests volume: 304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.279 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4524c81a-c3f5-467f-9cc0-5373802c9d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 304, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.279118', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bef15dbe-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': '8ad4152f3ac294cf79d0bd6a17498db110b8b6abb6e1d572d0f13f3249d7fd00'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.279118', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bef1671e-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': '0ef414fadee6c0f490a39997a479e6438efd6f16d51cd209e5321ad798107369'}]}, 'timestamp': '2026-01-22 00:10:23.279607', '_unique_id': '7883e297e59c445fa41f8ece20855891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.280 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.281 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>]
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.281 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.281 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1980643582>]
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.281 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38d896b3-8a55-40ff-8279-657191338b8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.281462', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'bef1b8fe-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': 'fa83ab2fb3c053b64a1753b556d80370b4ae6459d73c3fa3941d01ccf185fbdd'}]}, 'timestamp': '2026-01-22 00:10:23.281740', '_unique_id': '5e7424feb88e4faa967cf59c1cc55f10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.282 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d41e95a-f599-45fd-b393-04933b9612b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-vda', 'timestamp': '2026-01-22T00:10:23.282839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bef1efc2-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': '7bfa65ccc0d1eeec7b91eb573f545ee5e43fa04bd8bf48ffd7d5e4bbe7696659'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0-sda', 'timestamp': '2026-01-22T00:10:23.282839', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'instance-00000072', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bef1f7b0-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.899742593, 'message_signature': 'dfb222ef5d4c7eae5c6b285ff0a8a27347901b0eebf7de4c14ce12e8113686d4'}]}, 'timestamp': '2026-01-22 00:10:23.283288', '_unique_id': '36fb4d99662f4950ad74f88db885b1d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.284 12 DEBUG ceilometer.compute.pollsters [-] c8ef6504-1e46-493c-a14c-7cd1bebde8e0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea93459a-612b-4f1e-88a2-aadf82e7d0f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000072-c8ef6504-1e46-493c-a14c-7cd1bebde8e0-tap058e919a-93', 'timestamp': '2026-01-22T00:10:23.284378', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1980643582', 'name': 'tap058e919a-93', 'instance_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:3a:e3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap058e919a-93'}, 'message_id': 'bef22aaa-f726-11f0-b13b-fa163e425b77', 'monotonic_time': 5181.887980336, 'message_signature': '2585746e7a0efee5a5d7daff267f2635840f1f9dbbd477bd3c29a986d15a2044'}]}, 'timestamp': '2026-01-22 00:10:23.284607', '_unique_id': '59a5690d6a1745229a32d68e5fe18e23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:10:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:10:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:10:23 np0005591285 nova_compute[182755]: 2026-01-22 00:10:23.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:26 np0005591285 nova_compute[182755]: 2026-01-22 00:10:26.984 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:27 np0005591285 podman[228797]: 2026-01-22 00:10:27.174608968 +0000 UTC m=+0.048043523 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:10:27 np0005591285 nova_compute[182755]: 2026-01-22 00:10:27.620 182759 INFO nova.compute.manager [None req-bbfb6b6e-88dc-4e52-a257-0aa4a88b4baa 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Get console output#033[00m
Jan 21 19:10:27 np0005591285 nova_compute[182755]: 2026-01-22 00:10:27.627 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:10:28 np0005591285 nova_compute[182755]: 2026-01-22 00:10:28.672 182759 INFO nova.compute.manager [None req-40e0562c-18d0-4597-af6b-1ef6c357a38e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Get console output#033[00m
Jan 21 19:10:28 np0005591285 nova_compute[182755]: 2026-01-22 00:10:28.679 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:10:28 np0005591285 nova_compute[182755]: 2026-01-22 00:10:28.981 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:29Z|00423|binding|INFO|Releasing lport 552f0b46-d8c2-456a-a665-639ee7e20754 from this chassis (sb_readonly=0)
Jan 21 19:10:29 np0005591285 nova_compute[182755]: 2026-01-22 00:10:29.261 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:29 np0005591285 nova_compute[182755]: 2026-01-22 00:10:29.941 182759 DEBUG nova.compute.manager [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-changed-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:29 np0005591285 nova_compute[182755]: 2026-01-22 00:10:29.942 182759 DEBUG nova.compute.manager [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Refreshing instance network info cache due to event network-changed-058e919a-93d5-4b55-bae2-1ab8baad8296. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:10:29 np0005591285 nova_compute[182755]: 2026-01-22 00:10:29.942 182759 DEBUG oslo_concurrency.lockutils [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:10:29 np0005591285 nova_compute[182755]: 2026-01-22 00:10:29.942 182759 DEBUG oslo_concurrency.lockutils [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:10:29 np0005591285 nova_compute[182755]: 2026-01-22 00:10:29.942 182759 DEBUG nova.network.neutron [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Refreshing network info cache for port 058e919a-93d5-4b55-bae2-1ab8baad8296 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.115 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.116 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.116 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.116 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.117 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.129 182759 INFO nova.compute.manager [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Terminating instance#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.139 182759 DEBUG nova.compute.manager [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:10:30 np0005591285 kernel: tap058e919a-93 (unregistering): left promiscuous mode
Jan 21 19:10:30 np0005591285 NetworkManager[55017]: <info>  [1769040630.1649] device (tap058e919a-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.176 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:30Z|00424|binding|INFO|Releasing lport 058e919a-93d5-4b55-bae2-1ab8baad8296 from this chassis (sb_readonly=0)
Jan 21 19:10:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:30Z|00425|binding|INFO|Setting lport 058e919a-93d5-4b55-bae2-1ab8baad8296 down in Southbound
Jan 21 19:10:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:10:30Z|00426|binding|INFO|Removing iface tap058e919a-93 ovn-installed in OVS
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.180 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:30.189 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:3a:e3 10.100.0.3'], port_security=['fa:16:3e:d8:3a:e3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c8ef6504-1e46-493c-a14c-7cd1bebde8e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a1ecbca0-9209-47aa-82bf-9c5d3641042f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=797d444f-c0c6-4f3e-8fc3-fb2e70d06d89, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=058e919a-93d5-4b55-bae2-1ab8baad8296) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:10:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:30.190 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 058e919a-93d5-4b55-bae2-1ab8baad8296 in datapath 8eb4d5e9-8ac1-4d97-a3ca-29514334d492 unbound from our chassis#033[00m
Jan 21 19:10:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:30.191 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8eb4d5e9-8ac1-4d97-a3ca-29514334d492, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:10:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:30.192 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a794a0a2-5a64-421e-902c-da634977d632]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:30.193 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492 namespace which is not needed anymore#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.210 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:30 np0005591285 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 21 19:10:30 np0005591285 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000072.scope: Consumed 14.858s CPU time.
Jan 21 19:10:30 np0005591285 systemd-machined[154022]: Machine qemu-51-instance-00000072 terminated.
Jan 21 19:10:30 np0005591285 podman[228825]: 2026-01-22 00:10:30.313919453 +0000 UTC m=+0.087395071 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:10:30 np0005591285 podman[228831]: 2026-01-22 00:10:30.321640001 +0000 UTC m=+0.075040920 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:10:30 np0005591285 kernel: tap058e919a-93: entered promiscuous mode
Jan 21 19:10:30 np0005591285 kernel: tap058e919a-93 (unregistering): left promiscuous mode
Jan 21 19:10:30 np0005591285 NetworkManager[55017]: <info>  [1769040630.3584] manager: (tap058e919a-93): new Tun device (/org/freedesktop/NetworkManager/Devices/205)
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.362 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.404 182759 INFO nova.virt.libvirt.driver [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Instance destroyed successfully.#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.405 182759 DEBUG nova.objects.instance [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid c8ef6504-1e46-493c-a14c-7cd1bebde8e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.434 182759 DEBUG nova.virt.libvirt.vif [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1980643582',display_name='tempest-TestNetworkBasicOps-server-1980643582',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1980643582',id=114,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZ6XnigKgOH1qgP1qM1cbTSG+YSKJr4HPXzLmrePgGjqOoDXJutLYqYLK4oOPuYV4HHBQRnMUsgviAJJBFwFGWl4kwT3DJVEor+PG84hf0+tujPqIms5W8Uc9EI4E+YGQ==',key_name='tempest-TestNetworkBasicOps-575491139',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:10:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-dzas0rtl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:10:08Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=c8ef6504-1e46-493c-a14c-7cd1bebde8e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.434 182759 DEBUG nova.network.os_vif_util [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.435 182759 DEBUG nova.network.os_vif_util [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.436 182759 DEBUG os_vif [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.438 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.439 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap058e919a-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.443 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.446 182759 INFO os_vif [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:3a:e3,bridge_name='br-int',has_traffic_filtering=True,id=058e919a-93d5-4b55-bae2-1ab8baad8296,network=Network(8eb4d5e9-8ac1-4d97-a3ca-29514334d492),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap058e919a-93')#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.447 182759 INFO nova.virt.libvirt.driver [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Deleting instance files /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0_del#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.448 182759 INFO nova.virt.libvirt.driver [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Deletion of /var/lib/nova/instances/c8ef6504-1e46-493c-a14c-7cd1bebde8e0_del complete#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.639 182759 INFO nova.compute.manager [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.640 182759 DEBUG oslo.service.loopingcall [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.641 182759 DEBUG nova.compute.manager [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.641 182759 DEBUG nova.network.neutron [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.671 182759 DEBUG nova.compute.manager [req-8a41675f-6b2b-46d9-836f-14df4f4062f1 req-50c890c0-6d1a-45b8-b638-a404c438e2d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-vif-unplugged-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.672 182759 DEBUG oslo_concurrency.lockutils [req-8a41675f-6b2b-46d9-836f-14df4f4062f1 req-50c890c0-6d1a-45b8-b638-a404c438e2d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.673 182759 DEBUG oslo_concurrency.lockutils [req-8a41675f-6b2b-46d9-836f-14df4f4062f1 req-50c890c0-6d1a-45b8-b638-a404c438e2d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.673 182759 DEBUG oslo_concurrency.lockutils [req-8a41675f-6b2b-46d9-836f-14df4f4062f1 req-50c890c0-6d1a-45b8-b638-a404c438e2d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.674 182759 DEBUG nova.compute.manager [req-8a41675f-6b2b-46d9-836f-14df4f4062f1 req-50c890c0-6d1a-45b8-b638-a404c438e2d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] No waiting events found dispatching network-vif-unplugged-058e919a-93d5-4b55-bae2-1ab8baad8296 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:10:30 np0005591285 nova_compute[182755]: 2026-01-22 00:10:30.675 182759 DEBUG nova.compute.manager [req-8a41675f-6b2b-46d9-836f-14df4f4062f1 req-50c890c0-6d1a-45b8-b638-a404c438e2d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-vif-unplugged-058e919a-93d5-4b55-bae2-1ab8baad8296 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:10:30 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [NOTICE]   (228702) : haproxy version is 2.8.14-c23fe91
Jan 21 19:10:30 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [NOTICE]   (228702) : path to executable is /usr/sbin/haproxy
Jan 21 19:10:30 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [WARNING]  (228702) : Exiting Master process...
Jan 21 19:10:30 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [ALERT]    (228702) : Current worker (228704) exited with code 143 (Terminated)
Jan 21 19:10:30 np0005591285 neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492[228698]: [WARNING]  (228702) : All workers exited. Exiting... (0)
Jan 21 19:10:30 np0005591285 systemd[1]: libpod-8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b.scope: Deactivated successfully.
Jan 21 19:10:30 np0005591285 podman[228873]: 2026-01-22 00:10:30.804601662 +0000 UTC m=+0.506098154 container died 8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:10:31 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b-userdata-shm.mount: Deactivated successfully.
Jan 21 19:10:31 np0005591285 systemd[1]: var-lib-containers-storage-overlay-38551ce169bfd3d73987e0824f50f46143f727f0b109cb1a2d89281396f11553-merged.mount: Deactivated successfully.
Jan 21 19:10:31 np0005591285 nova_compute[182755]: 2026-01-22 00:10:31.985 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:32 np0005591285 podman[228876]: 2026-01-22 00:10:32.175225341 +0000 UTC m=+1.852937544 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 19:10:32 np0005591285 podman[228873]: 2026-01-22 00:10:32.944487262 +0000 UTC m=+2.645983764 container cleanup 8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:10:32 np0005591285 systemd[1]: libpod-conmon-8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b.scope: Deactivated successfully.
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.077 182759 DEBUG nova.compute.manager [req-dff0957e-173b-4505-b6e3-a872473715a7 req-8478bab8-9257-4505-9992-058fa82b884f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.077 182759 DEBUG oslo_concurrency.lockutils [req-dff0957e-173b-4505-b6e3-a872473715a7 req-8478bab8-9257-4505-9992-058fa82b884f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.078 182759 DEBUG oslo_concurrency.lockutils [req-dff0957e-173b-4505-b6e3-a872473715a7 req-8478bab8-9257-4505-9992-058fa82b884f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.078 182759 DEBUG oslo_concurrency.lockutils [req-dff0957e-173b-4505-b6e3-a872473715a7 req-8478bab8-9257-4505-9992-058fa82b884f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.078 182759 DEBUG nova.compute.manager [req-dff0957e-173b-4505-b6e3-a872473715a7 req-8478bab8-9257-4505-9992-058fa82b884f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] No waiting events found dispatching network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.078 182759 WARNING nova.compute.manager [req-dff0957e-173b-4505-b6e3-a872473715a7 req-8478bab8-9257-4505-9992-058fa82b884f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received unexpected event network-vif-plugged-058e919a-93d5-4b55-bae2-1ab8baad8296 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.942 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040618.9413798, cacae884-d2ca-4741-952f-59ffbb641328 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:10:33 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.943 182759 INFO nova.compute.manager [-] [instance: cacae884-d2ca-4741-952f-59ffbb641328] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:10:34 np0005591285 nova_compute[182755]: 2026-01-22 00:10:33.999 182759 DEBUG nova.compute.manager [None req-8671b432-a7c6-4572-ab8a-5e7a60b0c031 - - - - - -] [instance: cacae884-d2ca-4741-952f-59ffbb641328] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:10:34 np0005591285 podman[228961]: 2026-01-22 00:10:34.412295805 +0000 UTC m=+1.435791791 container remove 8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.420 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f249e5a2-c0f1-46e3-86b6-dc6b8575e773]: (4, ('Thu Jan 22 12:10:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492 (8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b)\n8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b\nThu Jan 22 12:10:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492 (8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b)\n8159413f2072c9582343eeea508d4c23860594893b4d9a4985cac9e763ce777b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.423 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b634531d-aab7-4b91-a74b-8d853718cc2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.425 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eb4d5e9-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:10:34 np0005591285 nova_compute[182755]: 2026-01-22 00:10:34.427 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:34 np0005591285 kernel: tap8eb4d5e9-80: left promiscuous mode
Jan 21 19:10:34 np0005591285 nova_compute[182755]: 2026-01-22 00:10:34.432 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.436 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fc44dc6d-f263-420f-8be1-14c224bfcf3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 nova_compute[182755]: 2026-01-22 00:10:34.453 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.462 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[36b6ee31-f0fb-453b-95ce-6b9720393f2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.464 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bca544ef-0e4f-46ba-b92c-46ffdd663f55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.487 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0e06edb4-247e-4d78-88ef-50acd6adcffb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516143, 'reachable_time': 19410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228973, 'error': None, 'target': 'ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.491 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8eb4d5e9-8ac1-4d97-a3ca-29514334d492 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:10:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:10:34.491 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[dd18a318-8f1b-42f7-8608-d625877ad054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:10:34 np0005591285 systemd[1]: run-netns-ovnmeta\x2d8eb4d5e9\x2d8ac1\x2d4d97\x2da3ca\x2d29514334d492.mount: Deactivated successfully.
Jan 21 19:10:35 np0005591285 nova_compute[182755]: 2026-01-22 00:10:35.442 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.197 182759 DEBUG nova.network.neutron [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.224 182759 INFO nova.compute.manager [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Took 5.58 seconds to deallocate network for instance.#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.534 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.535 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.641 182759 DEBUG nova.compute.manager [req-496c5680-efe6-41af-b4ce-eba7f3b0e3c2 req-05d6215d-ffaa-4275-818e-f5d95dec5357 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Received event network-vif-deleted-058e919a-93d5-4b55-bae2-1ab8baad8296 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.716 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.732 182759 DEBUG nova.compute.provider_tree [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.781 182759 DEBUG nova.scheduler.client.report [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.832 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.914 182759 INFO nova.scheduler.client.report [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance c8ef6504-1e46-493c-a14c-7cd1bebde8e0#033[00m
Jan 21 19:10:36 np0005591285 nova_compute[182755]: 2026-01-22 00:10:36.986 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:37 np0005591285 nova_compute[182755]: 2026-01-22 00:10:37.051 182759 DEBUG nova.network.neutron [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updated VIF entry in instance network info cache for port 058e919a-93d5-4b55-bae2-1ab8baad8296. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:10:37 np0005591285 nova_compute[182755]: 2026-01-22 00:10:37.052 182759 DEBUG nova.network.neutron [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Updating instance_info_cache with network_info: [{"id": "058e919a-93d5-4b55-bae2-1ab8baad8296", "address": "fa:16:3e:d8:3a:e3", "network": {"id": "8eb4d5e9-8ac1-4d97-a3ca-29514334d492", "bridge": "br-int", "label": "tempest-network-smoke--436957495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap058e919a-93", "ovs_interfaceid": "058e919a-93d5-4b55-bae2-1ab8baad8296", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:10:37 np0005591285 nova_compute[182755]: 2026-01-22 00:10:37.147 182759 DEBUG oslo_concurrency.lockutils [None req-b311045f-fa46-4cf1-baee-c165dfc1c7e1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "c8ef6504-1e46-493c-a14c-7cd1bebde8e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:37 np0005591285 nova_compute[182755]: 2026-01-22 00:10:37.151 182759 DEBUG oslo_concurrency.lockutils [req-3c577c98-a741-4faa-99b7-741dae60ae17 req-9dc1ae61-af4a-4192-ad81-796f4eef4327 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c8ef6504-1e46-493c-a14c-7cd1bebde8e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:10:39 np0005591285 nova_compute[182755]: 2026-01-22 00:10:39.184 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:39 np0005591285 nova_compute[182755]: 2026-01-22 00:10:39.184 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:39 np0005591285 nova_compute[182755]: 2026-01-22 00:10:39.184 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:10:40 np0005591285 nova_compute[182755]: 2026-01-22 00:10:40.444 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:41 np0005591285 nova_compute[182755]: 2026-01-22 00:10:41.296 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:42 np0005591285 nova_compute[182755]: 2026-01-22 00:10:42.055 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:43 np0005591285 nova_compute[182755]: 2026-01-22 00:10:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:43 np0005591285 nova_compute[182755]: 2026-01-22 00:10:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:43 np0005591285 nova_compute[182755]: 2026-01-22 00:10:43.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.255 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.400 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040630.4000819, c8ef6504-1e46-493c-a14c-7cd1bebde8e0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.401 182759 INFO nova.compute.manager [-] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.446 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:45 np0005591285 nova_compute[182755]: 2026-01-22 00:10:45.448 182759 DEBUG nova.compute.manager [None req-e949d5a2-2253-4145-8bb7-711f4dcb3898 - - - - - -] [instance: c8ef6504-1e46-493c-a14c-7cd1bebde8e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:10:47 np0005591285 nova_compute[182755]: 2026-01-22 00:10:47.057 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:47 np0005591285 nova_compute[182755]: 2026-01-22 00:10:47.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:47 np0005591285 nova_compute[182755]: 2026-01-22 00:10:47.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.546 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.547 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.547 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.547 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:10:48 np0005591285 podman[228981]: 2026-01-22 00:10:48.677662254 +0000 UTC m=+0.074865614 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 21 19:10:48 np0005591285 podman[228979]: 2026-01-22 00:10:48.693142381 +0000 UTC m=+0.098991644 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.747 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.748 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5744MB free_disk=73.19351577758789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.748 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:10:48 np0005591285 nova_compute[182755]: 2026-01-22 00:10:48.748 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:10:49 np0005591285 nova_compute[182755]: 2026-01-22 00:10:49.504 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:10:49 np0005591285 nova_compute[182755]: 2026-01-22 00:10:49.505 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:10:49 np0005591285 nova_compute[182755]: 2026-01-22 00:10:49.559 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:10:49 np0005591285 nova_compute[182755]: 2026-01-22 00:10:49.743 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:10:49 np0005591285 nova_compute[182755]: 2026-01-22 00:10:49.849 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:10:49 np0005591285 nova_compute[182755]: 2026-01-22 00:10:49.849 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:10:50 np0005591285 nova_compute[182755]: 2026-01-22 00:10:50.448 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:50 np0005591285 nova_compute[182755]: 2026-01-22 00:10:50.850 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:10:51 np0005591285 nova_compute[182755]: 2026-01-22 00:10:51.779 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:52 np0005591285 nova_compute[182755]: 2026-01-22 00:10:52.060 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:53 np0005591285 nova_compute[182755]: 2026-01-22 00:10:53.001 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:53 np0005591285 nova_compute[182755]: 2026-01-22 00:10:53.824 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:53 np0005591285 nova_compute[182755]: 2026-01-22 00:10:53.936 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:55 np0005591285 nova_compute[182755]: 2026-01-22 00:10:55.450 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:57 np0005591285 nova_compute[182755]: 2026-01-22 00:10:57.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:10:58 np0005591285 podman[229022]: 2026-01-22 00:10:58.16889552 +0000 UTC m=+0.044542019 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:11:00 np0005591285 nova_compute[182755]: 2026-01-22 00:11:00.452 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:01 np0005591285 podman[229047]: 2026-01-22 00:11:01.17870149 +0000 UTC m=+0.050852638 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:11:01 np0005591285 podman[229046]: 2026-01-22 00:11:01.217035091 +0000 UTC m=+0.081673908 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:11:01 np0005591285 nova_compute[182755]: 2026-01-22 00:11:01.852 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:01 np0005591285 nova_compute[182755]: 2026-01-22 00:11:01.853 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:01 np0005591285 nova_compute[182755]: 2026-01-22 00:11:01.950 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:11:02 np0005591285 nova_compute[182755]: 2026-01-22 00:11:02.143 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:02 np0005591285 nova_compute[182755]: 2026-01-22 00:11:02.434 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:02 np0005591285 nova_compute[182755]: 2026-01-22 00:11:02.435 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:02 np0005591285 nova_compute[182755]: 2026-01-22 00:11:02.444 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:11:02 np0005591285 nova_compute[182755]: 2026-01-22 00:11:02.444 182759 INFO nova.compute.claims [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:11:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:02.973 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:02.974 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:02.974 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:03 np0005591285 podman[229087]: 2026-01-22 00:11:03.28619507 +0000 UTC m=+0.156987424 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 19:11:03 np0005591285 nova_compute[182755]: 2026-01-22 00:11:03.777 182759 DEBUG nova.compute.provider_tree [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:11:03 np0005591285 nova_compute[182755]: 2026-01-22 00:11:03.805 182759 DEBUG nova.scheduler.client.report [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:11:04 np0005591285 nova_compute[182755]: 2026-01-22 00:11:04.130 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:04 np0005591285 nova_compute[182755]: 2026-01-22 00:11:04.131 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:11:04 np0005591285 nova_compute[182755]: 2026-01-22 00:11:04.897 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:11:04 np0005591285 nova_compute[182755]: 2026-01-22 00:11:04.897 182759 DEBUG nova.network.neutron [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:11:04 np0005591285 nova_compute[182755]: 2026-01-22 00:11:04.933 182759 INFO nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:11:05 np0005591285 nova_compute[182755]: 2026-01-22 00:11:05.487 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:06 np0005591285 nova_compute[182755]: 2026-01-22 00:11:06.311 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:11:06 np0005591285 nova_compute[182755]: 2026-01-22 00:11:06.587 182759 DEBUG nova.policy [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34c123183bb440f5812e26cf267019c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '09aea696a8524affb62dfae6819b6ba4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.146 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.426 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.427 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.427 182759 INFO nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Creating image(s)#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.428 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "/var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.428 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "/var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.429 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "/var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.442 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.524 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.525 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.526 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.536 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.588 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.589 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.730 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk 1073741824" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.732 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
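The five lines above show the qcow2 overlay creation pattern: acquire a lock named after the base image's content hash, run `qemu-img create -f qcow2 -o backing_file=...` against it, then release the lock, logging how long the caller waited and held. A minimal stdlib sketch of that pattern (the decorator and helper names here are illustrative stand-ins, not Nova's or oslo.concurrency's actual API):

```python
import threading
import time

# Hypothetical stand-in for oslo_concurrency.lockutils: serialize work on a
# shared base image so concurrent spawns do not race on the same file.
_locks = {}
_locks_guard = threading.Lock()

def synchronized(name):
    """Decorator mimicking the acquired/waited/held log lines above."""
    def wrap(fn):
        def inner(*args, **kwargs):
            with _locks_guard:
                lock = _locks.setdefault(name, threading.Lock())
            t0 = time.monotonic()
            with lock:
                waited = time.monotonic() - t0
                t1 = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    held = time.monotonic() - t1
                    print(f'Lock "{name}" :: waited {waited:.3f}s, held {held:.3f}s')
        return inner
    return wrap

@synchronized("3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474")
def create_qcow2_image(base, target, size):
    # The real code shells out to:
    #   qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <target> <size>
    # Here we only assemble the argv, since qemu-img may not be installed.
    return ["qemu-img", "create", "-f", "qcow2",
            "-o", f"backing_file={base},backing_fmt=raw", target, str(size)]
```

The per-hash lock is what lets many instances share one `_base` image safely: only the first spawn pays the cost of populating it, later spawns merely wait on the lock.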
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.732 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.785 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.786 182759 DEBUG nova.virt.disk.api [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Checking if we can resize image /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.786 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.873 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
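Note how every `qemu-img info` call above is wrapped in `python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30`: the parsing of untrusted image headers is capped at 1 GiB of address space and 30 s of CPU time, so a malicious or corrupt image cannot exhaust the host. A sketch that rebuilds exactly the argv seen in the log (pure string assembly; the flag values are taken from the lines above):

```python
def prlimited_qemu_img_info(path, address_space=1073741824, cpu_seconds=30):
    """Build the wrapped command line seen in the log: prlimit caps the
    child's address space (--as) and CPU time (--cpu) before exec'ing
    qemu-img info, with the locale pinned to C for stable output."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={address_space}", f"--cpu={cpu_seconds}", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]
```

`--force-share` lets the info call succeed even while another process holds the image open, and `--output=json` gives the caller a machine-parseable result.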
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.874 182759 DEBUG nova.virt.disk.api [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Cannot resize image /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
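The "Cannot resize image ... to a smaller size" line is the grow-only check in `nova.virt.disk.api.can_resize_image`: the requested size is compared against the image's current virtual size (read from the `qemu-img info` JSON), and anything that is not strictly larger is skipped, including the equal-size no-op seen here. A hedged stdlib sketch of that comparison (`virtual-size` is qemu-img's real JSON key; the function shape is illustrative):

```python
import json

def can_resize_image(qemu_img_info_json, requested_size):
    """Grow-only resize gate: a request that does not exceed the current
    virtual size is refused, matching the debug line in the log above."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    if requested_size <= virtual_size:
        print("Cannot resize image to a smaller size.")
        return False
    return True
```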
Jan 21 19:11:07 np0005591285 nova_compute[182755]: 2026-01-22 00:11:07.875 182759 DEBUG nova.objects.instance [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lazy-loading 'migration_context' on Instance uuid 069e978e-d494-4830-93c7-f449d9fefe71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:11:08 np0005591285 nova_compute[182755]: 2026-01-22 00:11:08.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:08.410 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:11:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:08.411 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:11:08 np0005591285 nova_compute[182755]: 2026-01-22 00:11:08.579 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:11:08 np0005591285 nova_compute[182755]: 2026-01-22 00:11:08.580 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Ensure instance console log exists: /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:11:08 np0005591285 nova_compute[182755]: 2026-01-22 00:11:08.580 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:08 np0005591285 nova_compute[182755]: 2026-01-22 00:11:08.581 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:08 np0005591285 nova_compute[182755]: 2026-01-22 00:11:08.581 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:10 np0005591285 nova_compute[182755]: 2026-01-22 00:11:10.489 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:11 np0005591285 nova_compute[182755]: 2026-01-22 00:11:11.043 182759 DEBUG nova.network.neutron [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Successfully created port: 44b26f9f-3553-4a58-a1bf-068e5bc636a5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:11:12 np0005591285 nova_compute[182755]: 2026-01-22 00:11:12.148 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:15 np0005591285 nova_compute[182755]: 2026-01-22 00:11:15.492 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:17 np0005591285 nova_compute[182755]: 2026-01-22 00:11:17.203 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:18.413 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:19 np0005591285 podman[229132]: 2026-01-22 00:11:19.191165432 +0000 UTC m=+0.058907925 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 19:11:19 np0005591285 podman[229131]: 2026-01-22 00:11:19.191538592 +0000 UTC m=+0.059329057 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, release=1755695350)
Jan 21 19:11:20 np0005591285 nova_compute[182755]: 2026-01-22 00:11:20.494 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:22 np0005591285 nova_compute[182755]: 2026-01-22 00:11:22.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.308 182759 DEBUG nova.network.neutron [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Successfully updated port: 44b26f9f-3553-4a58-a1bf-068e5bc636a5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.556 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "refresh_cache-069e978e-d494-4830-93c7-f449d9fefe71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.556 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquired lock "refresh_cache-069e978e-d494-4830-93c7-f449d9fefe71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.557 182759 DEBUG nova.network.neutron [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.569 182759 DEBUG nova.compute.manager [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-changed-44b26f9f-3553-4a58-a1bf-068e5bc636a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.569 182759 DEBUG nova.compute.manager [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Refreshing instance network info cache due to event network-changed-44b26f9f-3553-4a58-a1bf-068e5bc636a5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:11:23 np0005591285 nova_compute[182755]: 2026-01-22 00:11:23.569 182759 DEBUG oslo_concurrency.lockutils [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-069e978e-d494-4830-93c7-f449d9fefe71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:11:24 np0005591285 nova_compute[182755]: 2026-01-22 00:11:24.555 182759 DEBUG nova.network.neutron [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:11:25 np0005591285 nova_compute[182755]: 2026-01-22 00:11:25.496 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.582 182759 DEBUG nova.network.neutron [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Updating instance_info_cache with network_info: [{"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.784 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Releasing lock "refresh_cache-069e978e-d494-4830-93c7-f449d9fefe71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.784 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Instance network_info: |[{"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
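The `network_info` blob logged above is a JSON list of VIF dicts, each nesting the port id, MAC, subnets, fixed IPs, and MTU. When reading these logs by hand, a small flattening helper saves a lot of squinting; this is a stdlib sketch (the function is hypothetical, but the keys match the structure in the log):

```python
import json

def summarize_vifs(network_info_json):
    """Flatten Nova's network_info (as logged above) into
    (port_id, mac, [fixed_ips], mtu) tuples for quick inspection."""
    out = []
    for vif in json.loads(network_info_json):
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"]]
        out.append((vif["id"], vif["address"], ips,
                    vif["network"]["meta"]["mtu"]))
    return out
```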
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.785 182759 DEBUG oslo_concurrency.lockutils [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-069e978e-d494-4830-93c7-f449d9fefe71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.785 182759 DEBUG nova.network.neutron [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Refreshing network info cache for port 44b26f9f-3553-4a58-a1bf-068e5bc636a5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.790 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Start _get_guest_xml network_info=[{"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.795 182759 WARNING nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.805 182759 DEBUG nova.virt.libvirt.host [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.806 182759 DEBUG nova.virt.libvirt.host [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.819 182759 DEBUG nova.virt.libvirt.host [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.820 182759 DEBUG nova.virt.libvirt.host [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.822 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.822 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.822 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.823 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.823 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.823 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.823 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.824 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.824 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.824 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.824 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.825 182759 DEBUG nova.virt.hardware [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.829 182759 DEBUG nova.virt.libvirt.vif [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-519178471',display_name='tempest-ServerAddressesNegativeTestJSON-server-519178471',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-519178471',id=115,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='09aea696a8524affb62dfae6819b6ba4',ramdisk_id='',reservation_id='r-7c5n96qo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-2089684813',owner_
user_name='tempest-ServerAddressesNegativeTestJSON-2089684813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:06Z,user_data=None,user_id='34c123183bb440f5812e26cf267019c7',uuid=069e978e-d494-4830-93c7-f449d9fefe71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.829 182759 DEBUG nova.network.os_vif_util [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Converting VIF {"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.830 182759 DEBUG nova.network.os_vif_util [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:11:26 np0005591285 nova_compute[182755]: 2026-01-22 00:11:26.831 182759 DEBUG nova.objects.instance [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 069e978e-d494-4830-93c7-f449d9fefe71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.011 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <uuid>069e978e-d494-4830-93c7-f449d9fefe71</uuid>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <name>instance-00000073</name>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-519178471</nova:name>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:11:26</nova:creationTime>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:user uuid="34c123183bb440f5812e26cf267019c7">tempest-ServerAddressesNegativeTestJSON-2089684813-project-member</nova:user>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:project uuid="09aea696a8524affb62dfae6819b6ba4">tempest-ServerAddressesNegativeTestJSON-2089684813</nova:project>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        <nova:port uuid="44b26f9f-3553-4a58-a1bf-068e5bc636a5">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <entry name="serial">069e978e-d494-4830-93c7-f449d9fefe71</entry>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <entry name="uuid">069e978e-d494-4830-93c7-f449d9fefe71</entry>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.config"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:dc:80:17"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <target dev="tap44b26f9f-35"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/console.log" append="off"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:11:27 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:11:27 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:11:27 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:11:27 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.012 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Preparing to wait for external event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.013 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.013 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.013 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.014 182759 DEBUG nova.virt.libvirt.vif [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-519178471',display_name='tempest-ServerAddressesNegativeTestJSON-server-519178471',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-519178471',id=115,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='09aea696a8524affb62dfae6819b6ba4',ramdisk_id='',reservation_id='r-7c5n96qo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-20896848
13',owner_user_name='tempest-ServerAddressesNegativeTestJSON-2089684813-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:06Z,user_data=None,user_id='34c123183bb440f5812e26cf267019c7',uuid=069e978e-d494-4830-93c7-f449d9fefe71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.014 182759 DEBUG nova.network.os_vif_util [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Converting VIF {"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.014 182759 DEBUG nova.network.os_vif_util [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.015 182759 DEBUG os_vif [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.015 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.016 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.016 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.018 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.019 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44b26f9f-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.019 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44b26f9f-35, col_values=(('external_ids', {'iface-id': '44b26f9f-3553-4a58-a1bf-068e5bc636a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:80:17', 'vm-uuid': '069e978e-d494-4830-93c7-f449d9fefe71'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.020 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:27 np0005591285 NetworkManager[55017]: <info>  [1769040687.0216] manager: (tap44b26f9f-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.023 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.026 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.027 182759 INFO os_vif [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35')#033[00m
Jan 21 19:11:27 np0005591285 nova_compute[182755]: 2026-01-22 00:11:27.237 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.067 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.067 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.068 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] No VIF found with MAC fa:16:3e:dc:80:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.068 182759 INFO nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Using config drive#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.833 182759 INFO nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Creating config drive at /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.config#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.838 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplzy5xvri execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:11:28 np0005591285 nova_compute[182755]: 2026-01-22 00:11:28.969 182759 DEBUG oslo_concurrency.processutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplzy5xvri" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:11:29 np0005591285 kernel: tap44b26f9f-35: entered promiscuous mode
Jan 21 19:11:29 np0005591285 NetworkManager[55017]: <info>  [1769040689.0447] manager: (tap44b26f9f-35): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Jan 21 19:11:29 np0005591285 systemd-udevd[229194]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.083 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:29Z|00427|binding|INFO|Claiming lport 44b26f9f-3553-4a58-a1bf-068e5bc636a5 for this chassis.
Jan 21 19:11:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:29Z|00428|binding|INFO|44b26f9f-3553-4a58-a1bf-068e5bc636a5: Claiming fa:16:3e:dc:80:17 10.100.0.9
Jan 21 19:11:29 np0005591285 NetworkManager[55017]: <info>  [1769040689.0994] device (tap44b26f9f-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:11:29 np0005591285 NetworkManager[55017]: <info>  [1769040689.1005] device (tap44b26f9f-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.110 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:80:17 10.100.0.9'], port_security=['fa:16:3e:dc:80:17 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '069e978e-d494-4830-93c7-f449d9fefe71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09aea696a8524affb62dfae6819b6ba4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7917ae04-8ebd-43d8-a1bb-2a5e3cec3a77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f85c045-0261-47dc-af36-82e03395c868, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=44b26f9f-3553-4a58-a1bf-068e5bc636a5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.111 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 44b26f9f-3553-4a58-a1bf-068e5bc636a5 in datapath ffed1ac3-0e62-43c1-a887-80d5e274a540 bound to our chassis#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.113 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ffed1ac3-0e62-43c1-a887-80d5e274a540#033[00m
Jan 21 19:11:29 np0005591285 systemd-machined[154022]: New machine qemu-52-instance-00000073.
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.127 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8aeceb64-1a77-45ce-b169-ea664f7b6b46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.128 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapffed1ac3-01 in ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.131 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapffed1ac3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.131 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[574dbf80-cf64-4c5b-abec-e3271d7c7b72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.131 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a6d173-9000-4e0e-854d-7039140ade3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 systemd[1]: Started Virtual Machine qemu-52-instance-00000073.
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.140 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:29Z|00429|binding|INFO|Setting lport 44b26f9f-3553-4a58-a1bf-068e5bc636a5 ovn-installed in OVS
Jan 21 19:11:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:29Z|00430|binding|INFO|Setting lport 44b26f9f-3553-4a58-a1bf-068e5bc636a5 up in Southbound
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.144 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[d196e0e8-da8c-4e84-8607-33a757647b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.148 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:29 np0005591285 podman[229180]: 2026-01-22 00:11:29.150850029 +0000 UTC m=+0.111381557 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.159 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f7787b-24f2-458c-bfaf-9a24fc65fc21]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.186 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[944c2b3c-e98c-4e93-8c6f-8ec5e0724797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.192 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bebe329f-eccd-4022-aa9f-c2bbf0dd1d2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 NetworkManager[55017]: <info>  [1769040689.1940] manager: (tapffed1ac3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/208)
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.228 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e68caced-c08c-42cd-962c-ab7edffbb3f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.232 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7a44be-5499-420d-bd6b-e8a10d822967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 NetworkManager[55017]: <info>  [1769040689.2573] device (tapffed1ac3-00): carrier: link connected
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.261 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[83d2a7cf-3811-400d-beed-8d39031eef23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.279 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5bc862fd-0ad9-4488-9119-c210fbe601c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapffed1ac3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:4b:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524791, 'reachable_time': 44241, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229243, 'error': None, 'target': 'ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.303 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5eb376-0eb0-4eb6-b7e3-52b27138c81c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee6:4b32'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524791, 'tstamp': 524791}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229244, 'error': None, 'target': 'ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.318 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b6967139-a401-4968-88da-6a73d48b7ffd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapffed1ac3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:4b:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 134], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524791, 'reachable_time': 44241, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229247, 'error': None, 'target': 'ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.346 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[138e6667-c5ec-4ec6-933c-0b96689bb897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.398 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2930f271-a8ab-45e1-a6bc-ac82f9752b70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.400 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffed1ac3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.401 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.402 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffed1ac3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.403 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040689.402006, 069e978e-d494-4830-93c7-f449d9fefe71 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:11:29 np0005591285 NetworkManager[55017]: <info>  [1769040689.4042] manager: (tapffed1ac3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 21 19:11:29 np0005591285 kernel: tapffed1ac3-00: entered promiscuous mode
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.404 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] VM Started (Lifecycle Event)#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.406 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapffed1ac3-00, col_values=(('external_ids', {'iface-id': '39b7b4af-1fea-41d6-bd6e-aeb4fe9f14d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.407 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:29Z|00431|binding|INFO|Releasing lport 39b7b4af-1fea-41d6-bd6e-aeb4fe9f14d7 from this chassis (sb_readonly=0)
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.409 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ffed1ac3-0e62-43c1-a887-80d5e274a540.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ffed1ac3-0e62-43c1-a887-80d5e274a540.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.410 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3b109716-0d32-4401-86b0-9f843e59b208]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.411 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-ffed1ac3-0e62-43c1-a887-80d5e274a540
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/ffed1ac3-0e62-43c1-a887-80d5e274a540.pid.haproxy
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID ffed1ac3-0e62-43c1-a887-80d5e274a540
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:11:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:29.412 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'env', 'PROCESS_TAG=haproxy-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ffed1ac3-0e62-43c1-a887-80d5e274a540.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.652 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.657 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040689.4027195, 069e978e-d494-4830-93c7-f449d9fefe71 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.657 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:11:29 np0005591285 podman[229284]: 2026-01-22 00:11:29.726864654 +0000 UTC m=+0.030199694 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.931 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:11:29 np0005591285 nova_compute[182755]: 2026-01-22 00:11:29.937 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:11:30 np0005591285 podman[229284]: 2026-01-22 00:11:30.090435243 +0000 UTC m=+0.393770253 container create 2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:11:30 np0005591285 nova_compute[182755]: 2026-01-22 00:11:30.121 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:11:30 np0005591285 systemd[1]: Started libpod-conmon-2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e.scope.
Jan 21 19:11:30 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:11:30 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1dc34d4c4763501e0fc380bd604e33a746078a1f857000191ea02a876f98837d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:11:30 np0005591285 podman[229284]: 2026-01-22 00:11:30.279427387 +0000 UTC m=+0.582762427 container init 2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:11:30 np0005591285 podman[229284]: 2026-01-22 00:11:30.285135251 +0000 UTC m=+0.588470261 container start 2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:11:30 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [NOTICE]   (229303) : New worker (229305) forked
Jan 21 19:11:30 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [NOTICE]   (229303) : Loading success.
Jan 21 19:11:30 np0005591285 nova_compute[182755]: 2026-01-22 00:11:30.829 182759 DEBUG nova.network.neutron [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Updated VIF entry in instance network info cache for port 44b26f9f-3553-4a58-a1bf-068e5bc636a5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:11:30 np0005591285 nova_compute[182755]: 2026-01-22 00:11:30.830 182759 DEBUG nova.network.neutron [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Updating instance_info_cache with network_info: [{"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:11:30 np0005591285 nova_compute[182755]: 2026-01-22 00:11:30.931 182759 DEBUG oslo_concurrency.lockutils [req-02290897-7c72-4a60-832f-2fc1c89f6526 req-97aebd2d-17ce-49f8-9dee-c25c74ddbc4c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-069e978e-d494-4830-93c7-f449d9fefe71" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:11:32 np0005591285 nova_compute[182755]: 2026-01-22 00:11:32.024 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:32 np0005591285 podman[229315]: 2026-01-22 00:11:32.177927415 +0000 UTC m=+0.050556701 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:11:32 np0005591285 podman[229314]: 2026-01-22 00:11:32.180238607 +0000 UTC m=+0.054733283 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 19:11:32 np0005591285 nova_compute[182755]: 2026-01-22 00:11:32.241 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:34 np0005591285 podman[229356]: 2026-01-22 00:11:34.210032347 +0000 UTC m=+0.082349317 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:11:37 np0005591285 nova_compute[182755]: 2026-01-22 00:11:37.027 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:37 np0005591285 nova_compute[182755]: 2026-01-22 00:11:37.291 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.686 182759 DEBUG nova.compute.manager [req-62b24225-045d-4f67-ae73-78ab1b851b22 req-83b70c99-9a33-444b-9218-d9ccc418c4f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.686 182759 DEBUG oslo_concurrency.lockutils [req-62b24225-045d-4f67-ae73-78ab1b851b22 req-83b70c99-9a33-444b-9218-d9ccc418c4f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.686 182759 DEBUG oslo_concurrency.lockutils [req-62b24225-045d-4f67-ae73-78ab1b851b22 req-83b70c99-9a33-444b-9218-d9ccc418c4f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.687 182759 DEBUG oslo_concurrency.lockutils [req-62b24225-045d-4f67-ae73-78ab1b851b22 req-83b70c99-9a33-444b-9218-d9ccc418c4f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.687 182759 DEBUG nova.compute.manager [req-62b24225-045d-4f67-ae73-78ab1b851b22 req-83b70c99-9a33-444b-9218-d9ccc418c4f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Processing event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.687 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.692 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040698.6920013, 069e978e-d494-4830-93c7-f449d9fefe71 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.692 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.694 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.697 182759 INFO nova.virt.libvirt.driver [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Instance spawned successfully.#033[00m
Jan 21 19:11:38 np0005591285 nova_compute[182755]: 2026-01-22 00:11:38.698 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.104 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.109 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.109 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.109 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.110 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.110 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.111 182759 DEBUG nova.virt.libvirt.driver [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.114 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.171 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.239 182759 INFO nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Took 31.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.240 182759 DEBUG nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.761 182759 INFO nova.compute.manager [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Took 37.38 seconds to build instance.#033[00m
Jan 21 19:11:39 np0005591285 nova_compute[182755]: 2026-01-22 00:11:39.828 182759 DEBUG oslo_concurrency.lockutils [None req-5bd32913-4296-4ff4-8858-5d64aeacd506 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 37.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:41 np0005591285 nova_compute[182755]: 2026-01-22 00:11:41.019 182759 DEBUG nova.compute.manager [req-79530e4e-4bb0-4ab8-9628-e9971dd8da88 req-77ee70a2-6272-4673-bcb3-8ee3d7da2994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:11:41 np0005591285 nova_compute[182755]: 2026-01-22 00:11:41.020 182759 DEBUG oslo_concurrency.lockutils [req-79530e4e-4bb0-4ab8-9628-e9971dd8da88 req-77ee70a2-6272-4673-bcb3-8ee3d7da2994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:41 np0005591285 nova_compute[182755]: 2026-01-22 00:11:41.020 182759 DEBUG oslo_concurrency.lockutils [req-79530e4e-4bb0-4ab8-9628-e9971dd8da88 req-77ee70a2-6272-4673-bcb3-8ee3d7da2994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:41 np0005591285 nova_compute[182755]: 2026-01-22 00:11:41.020 182759 DEBUG oslo_concurrency.lockutils [req-79530e4e-4bb0-4ab8-9628-e9971dd8da88 req-77ee70a2-6272-4673-bcb3-8ee3d7da2994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:41 np0005591285 nova_compute[182755]: 2026-01-22 00:11:41.020 182759 DEBUG nova.compute.manager [req-79530e4e-4bb0-4ab8-9628-e9971dd8da88 req-77ee70a2-6272-4673-bcb3-8ee3d7da2994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] No waiting events found dispatching network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:11:41 np0005591285 nova_compute[182755]: 2026-01-22 00:11:41.021 182759 WARNING nova.compute.manager [req-79530e4e-4bb0-4ab8-9628-e9971dd8da88 req-77ee70a2-6272-4673-bcb3-8ee3d7da2994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received unexpected event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:11:42 np0005591285 nova_compute[182755]: 2026-01-22 00:11:42.031 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:42 np0005591285 nova_compute[182755]: 2026-01-22 00:11:42.292 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.055 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.055 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.056 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.056 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.057 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.284 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:43 np0005591285 nova_compute[182755]: 2026-01-22 00:11:43.801 182759 INFO nova.compute.manager [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Terminating instance#033[00m
Jan 21 19:11:44 np0005591285 nova_compute[182755]: 2026-01-22 00:11:44.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:44 np0005591285 nova_compute[182755]: 2026-01-22 00:11:44.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:44 np0005591285 nova_compute[182755]: 2026-01-22 00:11:44.947 182759 DEBUG nova.compute.manager [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:11:44 np0005591285 kernel: tap44b26f9f-35 (unregistering): left promiscuous mode
Jan 21 19:11:44 np0005591285 NetworkManager[55017]: <info>  [1769040704.9699] device (tap44b26f9f-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:11:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:44Z|00432|binding|INFO|Releasing lport 44b26f9f-3553-4a58-a1bf-068e5bc636a5 from this chassis (sb_readonly=0)
Jan 21 19:11:44 np0005591285 nova_compute[182755]: 2026-01-22 00:11:44.981 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:44Z|00433|binding|INFO|Setting lport 44b26f9f-3553-4a58-a1bf-068e5bc636a5 down in Southbound
Jan 21 19:11:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:11:44Z|00434|binding|INFO|Removing iface tap44b26f9f-35 ovn-installed in OVS
Jan 21 19:11:44 np0005591285 nova_compute[182755]: 2026-01-22 00:11:44.987 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.014 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000073.scope: Deactivated successfully.
Jan 21 19:11:45 np0005591285 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000073.scope: Consumed 6.693s CPU time.
Jan 21 19:11:45 np0005591285 systemd-machined[154022]: Machine qemu-52-instance-00000073 terminated.
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.151 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dc:80:17 10.100.0.9'], port_security=['fa:16:3e:dc:80:17 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '069e978e-d494-4830-93c7-f449d9fefe71', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09aea696a8524affb62dfae6819b6ba4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7917ae04-8ebd-43d8-a1bb-2a5e3cec3a77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f85c045-0261-47dc-af36-82e03395c868, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=44b26f9f-3553-4a58-a1bf-068e5bc636a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.152 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 44b26f9f-3553-4a58-a1bf-068e5bc636a5 in datapath ffed1ac3-0e62-43c1-a887-80d5e274a540 unbound from our chassis#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.153 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ffed1ac3-0e62-43c1-a887-80d5e274a540, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.154 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[482a6610-4169-40e0-92d8-8e3210a6facb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.155 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540 namespace which is not needed anymore#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.216 182759 INFO nova.virt.libvirt.driver [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Instance destroyed successfully.#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.216 182759 DEBUG nova.objects.instance [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lazy-loading 'resources' on Instance uuid 069e978e-d494-4830-93c7-f449d9fefe71 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.220 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.221 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.221 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.267 182759 DEBUG nova.virt.libvirt.vif [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:10:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-519178471',display_name='tempest-ServerAddressesNegativeTestJSON-server-519178471',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-519178471',id=115,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:11:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='09aea696a8524affb62dfae6819b6ba4',ramdisk_id='',reservation_id='r-7c5n96qo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-2089684813',owner_user_name='tempest-ServerAddressesNegativeTestJSON-2089684813-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:11:39Z,user_data=None,user_id='34c123183bb440f5812e26cf267019c7',uuid=069e978e-d494-4830-93c7-f449d9fefe71,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.267 182759 DEBUG nova.network.os_vif_util [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Converting VIF {"id": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "address": "fa:16:3e:dc:80:17", "network": {"id": "ffed1ac3-0e62-43c1-a887-80d5e274a540", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1714373655-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "09aea696a8524affb62dfae6819b6ba4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44b26f9f-35", "ovs_interfaceid": "44b26f9f-3553-4a58-a1bf-068e5bc636a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.268 182759 DEBUG nova.network.os_vif_util [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.269 182759 DEBUG os_vif [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.270 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.270 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44b26f9f-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.272 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.274 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.275 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.277 182759 INFO os_vif [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:80:17,bridge_name='br-int',has_traffic_filtering=True,id=44b26f9f-3553-4a58-a1bf-068e5bc636a5,network=Network(ffed1ac3-0e62-43c1-a887-80d5e274a540),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44b26f9f-35')#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.277 182759 INFO nova.virt.libvirt.driver [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Deleting instance files /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71_del#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.278 182759 INFO nova.virt.libvirt.driver [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Deletion of /var/lib/nova/instances/069e978e-d494-4830-93c7-f449d9fefe71_del complete#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.289 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.290 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:11:45 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [NOTICE]   (229303) : haproxy version is 2.8.14-c23fe91
Jan 21 19:11:45 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [NOTICE]   (229303) : path to executable is /usr/sbin/haproxy
Jan 21 19:11:45 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [WARNING]  (229303) : Exiting Master process...
Jan 21 19:11:45 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [ALERT]    (229303) : Current worker (229305) exited with code 143 (Terminated)
Jan 21 19:11:45 np0005591285 neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540[229299]: [WARNING]  (229303) : All workers exited. Exiting... (0)
Jan 21 19:11:45 np0005591285 systemd[1]: libpod-2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e.scope: Deactivated successfully.
Jan 21 19:11:45 np0005591285 podman[229424]: 2026-01-22 00:11:45.311064945 +0000 UTC m=+0.056754069 container died 2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:11:45 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e-userdata-shm.mount: Deactivated successfully.
Jan 21 19:11:45 np0005591285 systemd[1]: var-lib-containers-storage-overlay-1dc34d4c4763501e0fc380bd604e33a746078a1f857000191ea02a876f98837d-merged.mount: Deactivated successfully.
Jan 21 19:11:45 np0005591285 podman[229424]: 2026-01-22 00:11:45.343455275 +0000 UTC m=+0.089144379 container cleanup 2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:11:45 np0005591285 systemd[1]: libpod-conmon-2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e.scope: Deactivated successfully.
Jan 21 19:11:45 np0005591285 podman[229454]: 2026-01-22 00:11:45.413088719 +0000 UTC m=+0.045792793 container remove 2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.417 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2d75579b-8ddd-48ff-b343-a01ca4ceb914]: (4, ('Thu Jan 22 12:11:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540 (2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e)\n2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e\nThu Jan 22 12:11:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540 (2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e)\n2205497d85c65ce7d77a4c175584170b83bfbe52e86367cbf01de96f7078f44e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.419 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[89be3bce-9e03-43ab-a25a-d23e79a919bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.420 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffed1ac3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.422 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 kernel: tapffed1ac3-00: left promiscuous mode
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.435 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.438 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[32d512d2-9396-4c08-889b-e18e335fb012]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.453 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[07b81e76-a6b5-4795-84a6-66d6695e87a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.455 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4515e1ad-7716-4858-8d9e-96782fe9b821]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.477 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7caa5cd4-4901-4c4d-8c24-09f8bc857cff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524784, 'reachable_time': 36562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229469, 'error': None, 'target': 'ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.481 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ffed1ac3-0e62-43c1-a887-80d5e274a540 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:11:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:11:45.481 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[7338a765-e996-422b-9534-d117e54f4548]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:11:45 np0005591285 systemd[1]: run-netns-ovnmeta\x2dffed1ac3\x2d0e62\x2d43c1\x2da887\x2d80d5e274a540.mount: Deactivated successfully.
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.767 182759 INFO nova.compute.manager [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Took 0.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.768 182759 DEBUG oslo.service.loopingcall [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.768 182759 DEBUG nova.compute.manager [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:11:45 np0005591285 nova_compute[182755]: 2026-01-22 00:11:45.768 182759 DEBUG nova.network.neutron [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.294 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.617 182759 DEBUG nova.compute.manager [req-ed20ae28-1785-435d-b437-261232253222 req-4aef4cee-20f4-4ca1-8785-ed9795b2f666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-vif-unplugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.617 182759 DEBUG oslo_concurrency.lockutils [req-ed20ae28-1785-435d-b437-261232253222 req-4aef4cee-20f4-4ca1-8785-ed9795b2f666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.617 182759 DEBUG oslo_concurrency.lockutils [req-ed20ae28-1785-435d-b437-261232253222 req-4aef4cee-20f4-4ca1-8785-ed9795b2f666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.617 182759 DEBUG oslo_concurrency.lockutils [req-ed20ae28-1785-435d-b437-261232253222 req-4aef4cee-20f4-4ca1-8785-ed9795b2f666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.617 182759 DEBUG nova.compute.manager [req-ed20ae28-1785-435d-b437-261232253222 req-4aef4cee-20f4-4ca1-8785-ed9795b2f666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] No waiting events found dispatching network-vif-unplugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:11:47 np0005591285 nova_compute[182755]: 2026-01-22 00:11:47.618 182759 DEBUG nova.compute.manager [req-ed20ae28-1785-435d-b437-261232253222 req-4aef4cee-20f4-4ca1-8785-ed9795b2f666 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-vif-unplugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.458 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.459 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.459 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.459 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.690 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.691 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5704MB free_disk=73.19342803955078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.692 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:48 np0005591285 nova_compute[182755]: 2026-01-22 00:11:48.692 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:49 np0005591285 nova_compute[182755]: 2026-01-22 00:11:49.719 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 069e978e-d494-4830-93c7-f449d9fefe71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:11:49 np0005591285 nova_compute[182755]: 2026-01-22 00:11:49.719 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:11:49 np0005591285 nova_compute[182755]: 2026-01-22 00:11:49.720 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:11:49 np0005591285 nova_compute[182755]: 2026-01-22 00:11:49.775 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.114 182759 DEBUG nova.compute.manager [req-530f51cd-e2e7-4880-8014-66c43ad39e8b req-13aebddc-4760-434f-8fd8-8a780f3bb873 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.114 182759 DEBUG oslo_concurrency.lockutils [req-530f51cd-e2e7-4880-8014-66c43ad39e8b req-13aebddc-4760-434f-8fd8-8a780f3bb873 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "069e978e-d494-4830-93c7-f449d9fefe71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.115 182759 DEBUG oslo_concurrency.lockutils [req-530f51cd-e2e7-4880-8014-66c43ad39e8b req-13aebddc-4760-434f-8fd8-8a780f3bb873 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.116 182759 DEBUG oslo_concurrency.lockutils [req-530f51cd-e2e7-4880-8014-66c43ad39e8b req-13aebddc-4760-434f-8fd8-8a780f3bb873 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.117 182759 DEBUG nova.compute.manager [req-530f51cd-e2e7-4880-8014-66c43ad39e8b req-13aebddc-4760-434f-8fd8-8a780f3bb873 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] No waiting events found dispatching network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.117 182759 WARNING nova.compute.manager [req-530f51cd-e2e7-4880-8014-66c43ad39e8b req-13aebddc-4760-434f-8fd8-8a780f3bb873 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received unexpected event network-vif-plugged-44b26f9f-3553-4a58-a1bf-068e5bc636a5 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.124 182759 DEBUG nova.compute.manager [req-b8f9086e-d617-4f0b-ae50-a0b97463b6d8 req-ee4340fc-46d5-4ad0-8ddd-ec1f0665da94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Received event network-vif-deleted-44b26f9f-3553-4a58-a1bf-068e5bc636a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.124 182759 INFO nova.compute.manager [req-b8f9086e-d617-4f0b-ae50-a0b97463b6d8 req-ee4340fc-46d5-4ad0-8ddd-ec1f0665da94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Neutron deleted interface 44b26f9f-3553-4a58-a1bf-068e5bc636a5; detaching it from the instance and deleting it from the info cache#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.125 182759 DEBUG nova.network.neutron [req-b8f9086e-d617-4f0b-ae50-a0b97463b6d8 req-ee4340fc-46d5-4ad0-8ddd-ec1f0665da94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.147 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:11:50 np0005591285 podman[229473]: 2026-01-22 00:11:50.214733919 +0000 UTC m=+0.071512015 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:11:50 np0005591285 podman[229472]: 2026-01-22 00:11:50.223590937 +0000 UTC m=+0.080085244 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.273 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.297 182759 DEBUG nova.network.neutron [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.592 182759 DEBUG nova.compute.manager [req-b8f9086e-d617-4f0b-ae50-a0b97463b6d8 req-ee4340fc-46d5-4ad0-8ddd-ec1f0665da94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Detach interface failed, port_id=44b26f9f-3553-4a58-a1bf-068e5bc636a5, reason: Instance 069e978e-d494-4830-93c7-f449d9fefe71 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.597 182759 INFO nova.compute.manager [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Took 4.83 seconds to deallocate network for instance.#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.679 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:11:50 np0005591285 nova_compute[182755]: 2026-01-22 00:11:50.679 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:51 np0005591285 nova_compute[182755]: 2026-01-22 00:11:51.675 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:51 np0005591285 nova_compute[182755]: 2026-01-22 00:11:51.806 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:11:51 np0005591285 nova_compute[182755]: 2026-01-22 00:11:51.807 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:11:51 np0005591285 nova_compute[182755]: 2026-01-22 00:11:51.815 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:11:51 np0005591285 nova_compute[182755]: 2026-01-22 00:11:51.847 182759 DEBUG nova.compute.provider_tree [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:11:52 np0005591285 nova_compute[182755]: 2026-01-22 00:11:52.163 182759 DEBUG nova.scheduler.client.report [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:11:52 np0005591285 nova_compute[182755]: 2026-01-22 00:11:52.296 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:52 np0005591285 nova_compute[182755]: 2026-01-22 00:11:52.976 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:53 np0005591285 nova_compute[182755]: 2026-01-22 00:11:53.245 182759 INFO nova.scheduler.client.report [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Deleted allocations for instance 069e978e-d494-4830-93c7-f449d9fefe71#033[00m
Jan 21 19:11:53 np0005591285 nova_compute[182755]: 2026-01-22 00:11:53.964 182759 DEBUG oslo_concurrency.lockutils [None req-55c081a7-99a4-4932-85e4-de6c73ea16f1 34c123183bb440f5812e26cf267019c7 09aea696a8524affb62dfae6819b6ba4 - - default default] Lock "069e978e-d494-4830-93c7-f449d9fefe71" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:11:55 np0005591285 nova_compute[182755]: 2026-01-22 00:11:55.283 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:11:57 np0005591285 nova_compute[182755]: 2026-01-22 00:11:57.298 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:00 np0005591285 podman[229514]: 2026-01-22 00:12:00.194770845 +0000 UTC m=+0.071150216 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:12:00 np0005591285 nova_compute[182755]: 2026-01-22 00:12:00.214 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040705.2133996, 069e978e-d494-4830-93c7-f449d9fefe71 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:12:00 np0005591285 nova_compute[182755]: 2026-01-22 00:12:00.215 182759 INFO nova.compute.manager [-] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:12:00 np0005591285 nova_compute[182755]: 2026-01-22 00:12:00.247 182759 DEBUG nova.compute.manager [None req-591c98f5-e29c-4913-a3d3-269c02f5e488 - - - - - -] [instance: 069e978e-d494-4830-93c7-f449d9fefe71] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:12:00 np0005591285 nova_compute[182755]: 2026-01-22 00:12:00.284 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:02 np0005591285 nova_compute[182755]: 2026-01-22 00:12:02.300 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:02.975 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:02.976 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:02.977 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:03 np0005591285 podman[229541]: 2026-01-22 00:12:03.200560389 +0000 UTC m=+0.058425233 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:12:03 np0005591285 podman[229540]: 2026-01-22 00:12:03.237455931 +0000 UTC m=+0.095192311 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 21 19:12:05 np0005591285 podman[229585]: 2026-01-22 00:12:05.228701104 +0000 UTC m=+0.099584281 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 19:12:05 np0005591285 nova_compute[182755]: 2026-01-22 00:12:05.285 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:06 np0005591285 nova_compute[182755]: 2026-01-22 00:12:06.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:07 np0005591285 nova_compute[182755]: 2026-01-22 00:12:07.302 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:08.972 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:12:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:08.973 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:12:08 np0005591285 nova_compute[182755]: 2026-01-22 00:12:08.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:10 np0005591285 nova_compute[182755]: 2026-01-22 00:12:10.288 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:10.975 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:12:12 np0005591285 nova_compute[182755]: 2026-01-22 00:12:12.324 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:15 np0005591285 nova_compute[182755]: 2026-01-22 00:12:15.292 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:17 np0005591285 nova_compute[182755]: 2026-01-22 00:12:17.325 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:20 np0005591285 nova_compute[182755]: 2026-01-22 00:12:20.294 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:21 np0005591285 podman[229612]: 2026-01-22 00:12:21.203071822 +0000 UTC m=+0.065315299 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:12:21 np0005591285 podman[229611]: 2026-01-22 00:12:21.213380958 +0000 UTC m=+0.072925792 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=)
Jan 21 19:12:22 np0005591285 nova_compute[182755]: 2026-01-22 00:12:22.327 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.164 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.165 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:12:23.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:12:25 np0005591285 nova_compute[182755]: 2026-01-22 00:12:25.296 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:27 np0005591285 nova_compute[182755]: 2026-01-22 00:12:27.328 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:30 np0005591285 nova_compute[182755]: 2026-01-22 00:12:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:30 np0005591285 nova_compute[182755]: 2026-01-22 00:12:30.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:12:30 np0005591285 nova_compute[182755]: 2026-01-22 00:12:30.298 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:31 np0005591285 podman[229652]: 2026-01-22 00:12:31.182208912 +0000 UTC m=+0.055571516 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:12:31 np0005591285 nova_compute[182755]: 2026-01-22 00:12:31.639 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:12:32 np0005591285 nova_compute[182755]: 2026-01-22 00:12:32.330 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:34 np0005591285 podman[229676]: 2026-01-22 00:12:34.183688259 +0000 UTC m=+0.055118144 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:12:34 np0005591285 podman[229677]: 2026-01-22 00:12:34.205934167 +0000 UTC m=+0.074915306 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:12:35 np0005591285 nova_compute[182755]: 2026-01-22 00:12:35.300 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:36 np0005591285 podman[229718]: 2026-01-22 00:12:36.213712975 +0000 UTC m=+0.085902462 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 19:12:37 np0005591285 nova_compute[182755]: 2026-01-22 00:12:37.383 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:40 np0005591285 nova_compute[182755]: 2026-01-22 00:12:40.303 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:41 np0005591285 nova_compute[182755]: 2026-01-22 00:12:41.634 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:41 np0005591285 nova_compute[182755]: 2026-01-22 00:12:41.635 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:41 np0005591285 nova_compute[182755]: 2026-01-22 00:12:41.635 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:12:42 np0005591285 nova_compute[182755]: 2026-01-22 00:12:42.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:45 np0005591285 nova_compute[182755]: 2026-01-22 00:12:45.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:45 np0005591285 nova_compute[182755]: 2026-01-22 00:12:45.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:45 np0005591285 nova_compute[182755]: 2026-01-22 00:12:45.305 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:46 np0005591285 nova_compute[182755]: 2026-01-22 00:12:46.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:47 np0005591285 nova_compute[182755]: 2026-01-22 00:12:47.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:47 np0005591285 nova_compute[182755]: 2026-01-22 00:12:47.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:12:47 np0005591285 nova_compute[182755]: 2026-01-22 00:12:47.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:12:47 np0005591285 nova_compute[182755]: 2026-01-22 00:12:47.387 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:48 np0005591285 nova_compute[182755]: 2026-01-22 00:12:48.450 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:12:48 np0005591285 nova_compute[182755]: 2026-01-22 00:12:48.450 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.382 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.383 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.383 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.384 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.813 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.814 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5738MB free_disk=73.19357299804688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.814 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:49 np0005591285 nova_compute[182755]: 2026-01-22 00:12:49.814 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.472 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.473 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.520 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.543 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.822 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.823 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:50.935 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:12:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:50.936 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:12:50 np0005591285 nova_compute[182755]: 2026-01-22 00:12:50.937 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:51 np0005591285 nova_compute[182755]: 2026-01-22 00:12:51.533 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:51 np0005591285 nova_compute[182755]: 2026-01-22 00:12:51.534 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:51 np0005591285 nova_compute[182755]: 2026-01-22 00:12:51.620 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:12:52 np0005591285 podman[229744]: 2026-01-22 00:12:52.185591527 +0000 UTC m=+0.060313030 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 21 19:12:52 np0005591285 podman[229745]: 2026-01-22 00:12:52.213760848 +0000 UTC m=+0.074736344 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 19:12:52 np0005591285 nova_compute[182755]: 2026-01-22 00:12:52.422 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:52 np0005591285 nova_compute[182755]: 2026-01-22 00:12:52.823 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:53 np0005591285 nova_compute[182755]: 2026-01-22 00:12:53.103 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:53 np0005591285 nova_compute[182755]: 2026-01-22 00:12:53.104 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:53 np0005591285 nova_compute[182755]: 2026-01-22 00:12:53.965 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:53 np0005591285 nova_compute[182755]: 2026-01-22 00:12:53.965 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:53 np0005591285 nova_compute[182755]: 2026-01-22 00:12:53.974 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:12:53 np0005591285 nova_compute[182755]: 2026-01-22 00:12:53.974 182759 INFO nova.compute.claims [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:12:54 np0005591285 nova_compute[182755]: 2026-01-22 00:12:54.585 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:12:55 np0005591285 nova_compute[182755]: 2026-01-22 00:12:55.309 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.144 182759 DEBUG nova.compute.provider_tree [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.763 182759 DEBUG nova.scheduler.client.report [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.821 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.966 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.967 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.971 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.978 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:12:56 np0005591285 nova_compute[182755]: 2026-01-22 00:12:56.978 182759 INFO nova.compute.claims [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.438 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.438 182759 DEBUG nova.network.neutron [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.458 182759 DEBUG nova.compute.provider_tree [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.618 182759 DEBUG nova.policy [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ee45ba20dd444a5a5e88aa96cc8e043', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '873b2f2688e942d5924aa81fa18d84c0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.637 182759 DEBUG nova.scheduler.client.report [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.664 182759 INFO nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.882 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.883 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:12:57 np0005591285 nova_compute[182755]: 2026-01-22 00:12:57.924 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:12:58 np0005591285 ovn_controller[94908]: 2026-01-22T00:12:58Z|00435|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.470 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.470 182759 DEBUG nova.network.neutron [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.564 182759 INFO nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.596 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.598 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.599 182759 INFO nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Creating image(s)#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.600 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.600 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.601 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.620 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.623 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.692 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.694 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.695 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.720 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.740 182759 DEBUG nova.policy [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.776 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.777 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.814 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.815 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.816 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.869 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.870 182759 DEBUG nova.virt.disk.api [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Checking if we can resize image /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.871 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.927 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.928 182759 DEBUG nova.virt.disk.api [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Cannot resize image /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:12:58 np0005591285 nova_compute[182755]: 2026-01-22 00:12:58.929 182759 DEBUG nova.objects.instance [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'migration_context' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:12:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:12:58.938 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.001 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.002 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.002 182759 INFO nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Creating image(s)#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.003 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.003 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.004 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.018 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.018 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Ensure instance console log exists: /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.018 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.018 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.019 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.019 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.074 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.075 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.075 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.087 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.104 182759 DEBUG nova.network.neutron [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Successfully created port: dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.138 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.139 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.171 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.173 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.173 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.231 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.232 182759 DEBUG nova.virt.disk.api [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.232 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.291 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.292 182759 DEBUG nova.virt.disk.api [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.292 182759 DEBUG nova.objects.instance [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 118577c2-2440-472a-b858-f075b2a804b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.311 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.311 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Ensure instance console log exists: /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.312 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.312 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.312 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:12:59 np0005591285 nova_compute[182755]: 2026-01-22 00:12:59.593 182759 DEBUG nova.network.neutron [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Successfully created port: f091e31e-112e-4a90-9947-5a807f422c9c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:13:00 np0005591285 nova_compute[182755]: 2026-01-22 00:13:00.354 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:02 np0005591285 podman[229815]: 2026-01-22 00:13:02.216182125 +0000 UTC m=+0.077244730 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:13:02 np0005591285 nova_compute[182755]: 2026-01-22 00:13:02.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:02.975 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:02.976 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:02.976 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:05 np0005591285 podman[229839]: 2026-01-22 00:13:05.18811037 +0000 UTC m=+0.051962116 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:13:05 np0005591285 podman[229838]: 2026-01-22 00:13:05.218219824 +0000 UTC m=+0.086032656 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:13:05 np0005591285 nova_compute[182755]: 2026-01-22 00:13:05.356 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:07 np0005591285 podman[229879]: 2026-01-22 00:13:07.256188038 +0000 UTC m=+0.130810668 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:13:07 np0005591285 nova_compute[182755]: 2026-01-22 00:13:07.465 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.494 182759 DEBUG nova.network.neutron [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Successfully updated port: dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.550 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.551 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquired lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.551 182759 DEBUG nova.network.neutron [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.623 182759 DEBUG nova.network.neutron [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Successfully updated port: f091e31e-112e-4a90-9947-5a807f422c9c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.775 182759 DEBUG nova.compute.manager [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-changed-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.776 182759 DEBUG nova.compute.manager [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Refreshing instance network info cache due to event network-changed-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.776 182759 DEBUG oslo_concurrency.lockutils [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.822 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.823 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.823 182759 DEBUG nova.network.neutron [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.889 182759 DEBUG nova.compute.manager [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-changed-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.890 182759 DEBUG nova.compute.manager [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Refreshing instance network info cache due to event network-changed-f091e31e-112e-4a90-9947-5a807f422c9c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.890 182759 DEBUG oslo_concurrency.lockutils [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:08 np0005591285 nova_compute[182755]: 2026-01-22 00:13:08.967 182759 DEBUG nova.network.neutron [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:13:09 np0005591285 nova_compute[182755]: 2026-01-22 00:13:09.033 182759 DEBUG nova.network.neutron [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.004 182759 DEBUG nova.network.neutron [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Updating instance_info_cache with network_info: [{"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.326 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.326 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Instance network_info: |[{"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.328 182759 DEBUG oslo_concurrency.lockutils [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.328 182759 DEBUG nova.network.neutron [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Refreshing network info cache for port f091e31e-112e-4a90-9947-5a807f422c9c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.334 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Start _get_guest_xml network_info=[{"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.341 182759 WARNING nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.355 182759 DEBUG nova.virt.libvirt.host [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.357 182759 DEBUG nova.virt.libvirt.host [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.359 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.361 182759 DEBUG nova.virt.libvirt.host [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.362 182759 DEBUG nova.virt.libvirt.host [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.365 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.365 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.366 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.367 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.367 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.368 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.368 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.369 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.370 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.370 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.370 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.371 182759 DEBUG nova.virt.hardware [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.380 182759 DEBUG nova.virt.libvirt.vif [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:12:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1399445424',display_name='tempest-TestNetworkBasicOps-server-1399445424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1399445424',id=119,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVR4jwL+EdUdOjKT2r4p3G73DZqMg92wtbjTJGvSWwJFmKZ8OOzFfVJeEzaB4zscbAeSw1Wszev9bU02pUh4vhhfe5YmWTWD6v3j1dlzOjM4Q/vNeMgCbEUgq0iq35//w==',key_name='tempest-TestNetworkBasicOps-1037238834',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-25upqi0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:12:58Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=118577c2-2440-472a-b858-f075b2a804b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.380 182759 DEBUG nova.network.os_vif_util [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.382 182759 DEBUG nova.network.os_vif_util [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.384 182759 DEBUG nova.objects.instance [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 118577c2-2440-472a-b858-f075b2a804b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.395 182759 DEBUG nova.network.neutron [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Updating instance_info_cache with network_info: [{"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.609 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <uuid>118577c2-2440-472a-b858-f075b2a804b1</uuid>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <name>instance-00000077</name>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-1399445424</nova:name>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:13:10</nova:creationTime>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        <nova:port uuid="f091e31e-112e-4a90-9947-5a807f422c9c">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <entry name="serial">118577c2-2440-472a-b858-f075b2a804b1</entry>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <entry name="uuid">118577c2-2440-472a-b858-f075b2a804b1</entry>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.config"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:a7:98:77"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <target dev="tapf091e31e-11"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/console.log" append="off"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:13:10 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:13:10 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:13:10 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:13:10 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.611 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Preparing to wait for external event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.611 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.612 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.612 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.614 182759 DEBUG nova.virt.libvirt.vif [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:12:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1399445424',display_name='tempest-TestNetworkBasicOps-server-1399445424',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1399445424',id=119,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVR4jwL+EdUdOjKT2r4p3G73DZqMg92wtbjTJGvSWwJFmKZ8OOzFfVJeEzaB4zscbAeSw1Wszev9bU02pUh4vhhfe5YmWTWD6v3j1dlzOjM4Q/vNeMgCbEUgq0iq35//w==',key_name='tempest-TestNetworkBasicOps-1037238834',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-25upqi0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:12:58Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=118577c2-2440-472a-b858-f075b2a804b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.614 182759 DEBUG nova.network.os_vif_util [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.616 182759 DEBUG nova.network.os_vif_util [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.616 182759 DEBUG os_vif [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.617 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.618 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.619 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.629 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.630 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf091e31e-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.630 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf091e31e-11, col_values=(('external_ids', {'iface-id': 'f091e31e-112e-4a90-9947-5a807f422c9c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:98:77', 'vm-uuid': '118577c2-2440-472a-b858-f075b2a804b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.632 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:10 np0005591285 NetworkManager[55017]: <info>  [1769040790.6335] manager: (tapf091e31e-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.634 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.639 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:10 np0005591285 nova_compute[182755]: 2026-01-22 00:13:10.639 182759 INFO os_vif [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11')#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.338 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Releasing lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.339 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance network_info: |[{"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.339 182759 DEBUG oslo_concurrency.lockutils [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.340 182759 DEBUG nova.network.neutron [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Refreshing network info cache for port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.343 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Start _get_guest_xml network_info=[{"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.348 182759 WARNING nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.356 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.356 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.356 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:a7:98:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.357 182759 INFO nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Using config drive#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.360 182759 DEBUG nova.virt.libvirt.host [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.360 182759 DEBUG nova.virt.libvirt.host [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.370 182759 DEBUG nova.virt.libvirt.host [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.370 182759 DEBUG nova.virt.libvirt.host [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.371 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.371 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.372 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.372 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.372 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.373 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.373 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.373 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.374 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.374 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.374 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.374 182759 DEBUG nova.virt.hardware [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.378 182759 DEBUG nova.virt.libvirt.vif [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:12:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-300361081',display_name='tempest-InstanceActionsTestJSON-server-300361081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-300361081',id=118,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='873b2f2688e942d5924aa81fa18d84c0',ramdisk_id='',reservation_id='r-0sonkysu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-232501859',owner_user_name='tempest-InstanceActionsTestJSON-232501859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:12:58Z,user_data=None,user_id='9ee45ba20dd444a5a5e88aa96cc8e043',uuid=ee09d802-1f59-4f58-befa-a281fe642b6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.378 182759 DEBUG nova.network.os_vif_util [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converting VIF {"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.379 182759 DEBUG nova.network.os_vif_util [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.380 182759 DEBUG nova.objects.instance [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.405 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <uuid>ee09d802-1f59-4f58-befa-a281fe642b6b</uuid>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <name>instance-00000076</name>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:name>tempest-InstanceActionsTestJSON-server-300361081</nova:name>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:13:11</nova:creationTime>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:user uuid="9ee45ba20dd444a5a5e88aa96cc8e043">tempest-InstanceActionsTestJSON-232501859-project-member</nova:user>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:project uuid="873b2f2688e942d5924aa81fa18d84c0">tempest-InstanceActionsTestJSON-232501859</nova:project>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        <nova:port uuid="dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <entry name="serial">ee09d802-1f59-4f58-befa-a281fe642b6b</entry>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <entry name="uuid">ee09d802-1f59-4f58-befa-a281fe642b6b</entry>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:f6:6f:67"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <target dev="tapdc2fa6e9-f8"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/console.log" append="off"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:13:11 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:13:11 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:13:11 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:13:11 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.406 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Preparing to wait for external event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.407 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.407 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.407 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.408 182759 DEBUG nova.virt.libvirt.vif [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:12:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-300361081',display_name='tempest-InstanceActionsTestJSON-server-300361081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-300361081',id=118,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='873b2f2688e942d5924aa81fa18d84c0',ramdisk_id='',reservation_id='r-0sonkysu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-232501859',owner_user_name='tempest-InstanceActionsTestJSON-232501859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:12:58Z,user_data=None,user_id='9ee45ba20dd444a5a5e88aa96cc8e043',uuid=ee09d802-1f59-4f58-befa-a281fe642b6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.409 182759 DEBUG nova.network.os_vif_util [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converting VIF {"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.409 182759 DEBUG nova.network.os_vif_util [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.410 182759 DEBUG os_vif [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.411 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.411 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.416 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc2fa6e9-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.417 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc2fa6e9-f8, col_values=(('external_ids', {'iface-id': 'dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:6f:67', 'vm-uuid': 'ee09d802-1f59-4f58-befa-a281fe642b6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.418 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:11 np0005591285 NetworkManager[55017]: <info>  [1769040791.4200] manager: (tapdc2fa6e9-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.430 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.431 182759 INFO os_vif [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8')#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.516 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.516 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.516 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] No VIF found with MAC fa:16:3e:f6:6f:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.517 182759 INFO nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Using config drive#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.896 182759 INFO nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Creating config drive at /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.config#033[00m
Jan 21 19:13:11 np0005591285 nova_compute[182755]: 2026-01-22 00:13:11.903 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps7okait1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.029 182759 DEBUG oslo_concurrency.processutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps7okait1" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.1137] manager: (tapf091e31e-11): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Jan 21 19:13:12 np0005591285 kernel: tapf091e31e-11: entered promiscuous mode
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.119 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00436|binding|INFO|Claiming lport f091e31e-112e-4a90-9947-5a807f422c9c for this chassis.
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00437|binding|INFO|f091e31e-112e-4a90-9947-5a807f422c9c: Claiming fa:16:3e:a7:98:77 10.100.0.11
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.137 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 systemd-machined[154022]: New machine qemu-53-instance-00000077.
Jan 21 19:13:12 np0005591285 systemd[1]: Started Virtual Machine qemu-53-instance-00000077.
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00438|binding|INFO|Setting lport f091e31e-112e-4a90-9947-5a807f422c9c ovn-installed in OVS
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.218 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 systemd-udevd[229929]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.2547] device (tapf091e31e-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.2558] device (tapf091e31e-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00439|binding|INFO|Setting lport f091e31e-112e-4a90-9947-5a807f422c9c up in Southbound
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.281 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:98:77 10.100.0.11'], port_security=['fa:16:3e:a7:98:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '118577c2-2440-472a-b858-f075b2a804b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '08295c1c-ae1e-44be-8dc2-34f42af0072b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d452ef76-084d-4578-ab80-dfb49c9c8f9b, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=f091e31e-112e-4a90-9947-5a807f422c9c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.283 104259 INFO neutron.agent.ovn.metadata.agent [-] Port f091e31e-112e-4a90-9947-5a807f422c9c in datapath 88a7330a-aaa1-424a-b4dc-f7500e450abb bound to our chassis#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.284 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88a7330a-aaa1-424a-b4dc-f7500e450abb#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.300 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[de65532b-6953-476c-8f37-fa3a5ec97cc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.301 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88a7330a-a1 in ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.303 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88a7330a-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.303 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a33f401-5c90-48ee-8cd4-24de0278e1bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.304 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1abc3da7-60bb-4c57-8959-6c9bbc43fff3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.319 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[911f9ab0-798f-4a73-9e87-7104a41f7ebe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.334 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cd05766d-18c7-4835-935b-6b5ec0362a50]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.368 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[31b7b26e-1994-4a24-9113-33751794f0a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.376 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[86bed879-52ff-45a5-b7bf-5d30162383b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.3786] manager: (tap88a7330a-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Jan 21 19:13:12 np0005591285 systemd-udevd[229933]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.419 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a70208b4-b43b-4a81-b570-3d6f27ebb20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.423 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8bf174-7a1c-425a-bcec-5ab41313848d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.4497] device (tap88a7330a-a0): carrier: link connected
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.453 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[1b918adf-caa4-436c-a159-a010f5e9e552]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.466 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.483 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f33e3be-9da9-4b02-a26d-c90ed93a091c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88a7330a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:32:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535110, 'reachable_time': 42327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229972, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.501 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf4ef38-d9ef-455c-84a3-b320d34f8044]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:3252'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535110, 'tstamp': 535110}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229975, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.517 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c33b87fe-8082-41f8-8608-8d7e687b9a87]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88a7330a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:32:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535110, 'reachable_time': 42327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229976, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.544 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040792.5438895, 118577c2-2440-472a-b858-f075b2a804b1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.545 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] VM Started (Lifecycle Event)#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.545 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b77076-6d69-4cc5-b4a9-52cc897255aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.590 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[89e8410c-d304-44f1-b893-ef2454ddf242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.592 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88a7330a-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.592 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.593 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88a7330a-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:12 np0005591285 kernel: tap88a7330a-a0: entered promiscuous mode
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.594 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.5954] manager: (tap88a7330a-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.601 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88a7330a-a0, col_values=(('external_ids', {'iface-id': 'f63f34ac-9af7-4a13-911f-2c9f043a5c66'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00440|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.617 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88a7330a-aaa1-424a-b4dc-f7500e450abb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88a7330a-aaa1-424a-b4dc-f7500e450abb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.618 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b17091b7-c5f1-49cf-96c7-027f16f15bc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.619 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-88a7330a-aaa1-424a-b4dc-f7500e450abb
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/88a7330a-aaa1-424a-b4dc-f7500e450abb.pid.haproxy
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 88a7330a-aaa1-424a-b4dc-f7500e450abb
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.619 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'env', 'PROCESS_TAG=haproxy-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88a7330a-aaa1-424a-b4dc-f7500e450abb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.656 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.659 182759 INFO nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Creating config drive at /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.665 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbt6xwe1a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.687 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040792.5441725, 118577c2-2440-472a-b858-f075b2a804b1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.688 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.758 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.762 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.791 182759 DEBUG oslo_concurrency.processutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbt6xwe1a" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.816 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:13:12 np0005591285 kernel: tapdc2fa6e9-f8: entered promiscuous mode
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.8595] manager: (tapdc2fa6e9-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00441|binding|INFO|Claiming lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for this chassis.
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00442|binding|INFO|dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9: Claiming fa:16:3e:f6:6f:67 10.100.0.10
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.860 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 systemd-udevd[229952]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.863 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.874 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.8797] device (tapdc2fa6e9-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:13:12 np0005591285 NetworkManager[55017]: <info>  [1769040792.8815] device (tapdc2fa6e9-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:13:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:12.908 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:6f:67 10.100.0.10'], port_security=['fa:16:3e:f6:6f:67 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ee09d802-1f59-4f58-befa-a281fe642b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '873b2f2688e942d5924aa81fa18d84c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e0235df0-8e10-4f2c-bca1-3481f216c1a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8d061-740c-490c-81d2-02d385a2e787, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:12 np0005591285 systemd-machined[154022]: New machine qemu-54-instance-00000076.
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00443|binding|INFO|Setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 ovn-installed in OVS
Jan 21 19:13:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:12Z|00444|binding|INFO|Setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 up in Southbound
Jan 21 19:13:12 np0005591285 systemd[1]: Started Virtual Machine qemu-54-instance-00000076.
Jan 21 19:13:12 np0005591285 nova_compute[182755]: 2026-01-22 00:13:12.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:12 np0005591285 podman[230030]: 2026-01-22 00:13:12.994650953 +0000 UTC m=+0.059542779 container create 501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.027 182759 DEBUG nova.compute.manager [req-056b48ed-d73c-43ff-a38a-9b767c8e2ea4 req-b9e807b6-00f1-431b-8f49-ca9cb33fbaac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.028 182759 DEBUG oslo_concurrency.lockutils [req-056b48ed-d73c-43ff-a38a-9b767c8e2ea4 req-b9e807b6-00f1-431b-8f49-ca9cb33fbaac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.028 182759 DEBUG oslo_concurrency.lockutils [req-056b48ed-d73c-43ff-a38a-9b767c8e2ea4 req-b9e807b6-00f1-431b-8f49-ca9cb33fbaac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.029 182759 DEBUG oslo_concurrency.lockutils [req-056b48ed-d73c-43ff-a38a-9b767c8e2ea4 req-b9e807b6-00f1-431b-8f49-ca9cb33fbaac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.029 182759 DEBUG nova.compute.manager [req-056b48ed-d73c-43ff-a38a-9b767c8e2ea4 req-b9e807b6-00f1-431b-8f49-ca9cb33fbaac 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Processing event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.029 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:13:13 np0005591285 systemd[1]: Started libpod-conmon-501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f.scope.
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.034 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040793.034183, 118577c2-2440-472a-b858-f075b2a804b1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.035 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.037 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.042 182759 INFO nova.virt.libvirt.driver [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Instance spawned successfully.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.043 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:13:13 np0005591285 podman[230030]: 2026-01-22 00:13:12.966488062 +0000 UTC m=+0.031379938 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:13:13 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.065 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:13 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd98519f1c4d189d5ef351bd8afbaf70b73c3fd624b829aeadadb977d87c64d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.076 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.079 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.080 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 podman[230030]: 2026-01-22 00:13:13.080544662 +0000 UTC m=+0.145436508 container init 501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.081 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.081 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.081 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.082 182759 DEBUG nova.virt.libvirt.driver [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 podman[230030]: 2026-01-22 00:13:13.085438573 +0000 UTC m=+0.150330399 container start 501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.107 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:13:13 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [NOTICE]   (230055) : New worker (230057) forked
Jan 21 19:13:13 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [NOTICE]   (230055) : Loading success.
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.140 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 in datapath d1e3ff28-7ba4-4007-895f-2557b60edefb unbound from our chassis#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.141 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1e3ff28-7ba4-4007-895f-2557b60edefb#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.151 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[52a521e8-b2b2-4567-b0e4-547053213e5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.153 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1e3ff28-71 in ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.155 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1e3ff28-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.155 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7e7c96c4-933d-4ba0-a104-93603a7fcf2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.156 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4870924b-c86e-4662-91f4-20c12a608285]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.170 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[80f02766-7895-43e7-a0f8-0a3bd3e58524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.192 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7976d82b-9e28-4059-83a2-41545907029f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.199 182759 INFO nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Took 14.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.200 182759 DEBUG nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.220 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8bf84c-a861-44bb-ac6b-336059b53d48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.227 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[78673639-00a3-4e87-a4cb-42342209ff04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 NetworkManager[55017]: <info>  [1769040793.2286] manager: (tapd1e3ff28-70): new Veth device (/org/freedesktop/NetworkManager/Devices/216)
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.258 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8042df05-a2f3-428a-8243-63a8c1a3496f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.263 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c2bb48f1-e634-4071-9aac-e1c8018e13d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 NetworkManager[55017]: <info>  [1769040793.2871] device (tapd1e3ff28-70): carrier: link connected
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.293 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[93587891-6f76-4700-ae71-112cc42e7a4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.308 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[716a268b-7798-45e0-a858-523dd317e9e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e3ff28-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:5b:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535194, 'reachable_time': 23331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230076, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.322 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e221d5b4-93b2-4b83-b299-ff9b462456f3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:5b09'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535194, 'tstamp': 535194}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230077, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.332 182759 DEBUG nova.compute.manager [req-b7bc55e6-33f8-4c8c-9519-856076310552 req-05ee1479-d89e-4263-8f0d-c2d6cf83d245 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.332 182759 DEBUG oslo_concurrency.lockutils [req-b7bc55e6-33f8-4c8c-9519-856076310552 req-05ee1479-d89e-4263-8f0d-c2d6cf83d245 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.333 182759 DEBUG oslo_concurrency.lockutils [req-b7bc55e6-33f8-4c8c-9519-856076310552 req-05ee1479-d89e-4263-8f0d-c2d6cf83d245 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.333 182759 DEBUG oslo_concurrency.lockutils [req-b7bc55e6-33f8-4c8c-9519-856076310552 req-05ee1479-d89e-4263-8f0d-c2d6cf83d245 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.334 182759 DEBUG nova.compute.manager [req-b7bc55e6-33f8-4c8c-9519-856076310552 req-05ee1479-d89e-4263-8f0d-c2d6cf83d245 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Processing event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.338 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3653b2e9-5310-45ef-a928-8a80eb58638a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e3ff28-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:5b:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535194, 'reachable_time': 23331, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230078, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.365 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[78b1c8c5-8e76-4cc0-82b7-dd1b9c7c779a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.428 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5dfe50af-b82d-43ec-a941-9432d8647d92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.430 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e3ff28-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.430 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.431 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1e3ff28-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:13 np0005591285 NetworkManager[55017]: <info>  [1769040793.4339] manager: (tapd1e3ff28-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Jan 21 19:13:13 np0005591285 kernel: tapd1e3ff28-70: entered promiscuous mode
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.435 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.438 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1e3ff28-70, col_values=(('external_ids', {'iface-id': '0b252d65-412b-4740-a674-4727d3037b7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.439 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:13 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:13Z|00445|binding|INFO|Releasing lport 0b252d65-412b-4740-a674-4727d3037b7f from this chassis (sb_readonly=0)
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.441 182759 INFO nova.compute.manager [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Took 17.30 seconds to build instance.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.457 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.458 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.459 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fab85c22-ce6d-4661-b797-34af902c2a0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.460 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-d1e3ff28-7ba4-4007-895f-2557b60edefb
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID d1e3ff28-7ba4-4007-895f-2557b60edefb
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:13:13 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:13.461 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'env', 'PROCESS_TAG=haproxy-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1e3ff28-7ba4-4007-895f-2557b60edefb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.462 182759 DEBUG oslo_concurrency.lockutils [None req-ce7cd976-0f13-413f-b6d1-7ca0671f190c 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.471 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040793.4717531, ee09d802-1f59-4f58-befa-a281fe642b6b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.473 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] VM Started (Lifecycle Event)#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.477 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.484 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.498 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.501 182759 INFO nova.virt.libvirt.driver [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance spawned successfully.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.501 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.504 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.536 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.536 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040793.4722888, ee09d802-1f59-4f58-befa-a281fe642b6b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.537 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.545 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.546 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.547 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.549 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.550 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.550 182759 DEBUG nova.virt.libvirt.driver [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.564 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.568 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040793.4821332, ee09d802-1f59-4f58-befa-a281fe642b6b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.568 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.586 182759 DEBUG nova.network.neutron [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Updated VIF entry in instance network info cache for port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.586 182759 DEBUG nova.network.neutron [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Updating instance_info_cache with network_info: [{"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.604 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.608 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.622 182759 DEBUG oslo_concurrency.lockutils [req-8ef12b3c-5f95-4688-94ad-20379dd1c96d req-ecae4b8f-7244-4a98-bf08-6a3ed1d28293 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.631 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.639 182759 INFO nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Took 15.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.640 182759 DEBUG nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.731 182759 INFO nova.compute.manager [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Took 21.61 seconds to build instance.#033[00m
Jan 21 19:13:13 np0005591285 nova_compute[182755]: 2026-01-22 00:13:13.782 182759 DEBUG oslo_concurrency.lockutils [None req-6475acbd-323f-4d3e-8fa6-ea96f95ecd1c 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:13 np0005591285 podman[230116]: 2026-01-22 00:13:13.875495877 +0000 UTC m=+0.060354070 container create 7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:13:13 np0005591285 systemd[1]: Started libpod-conmon-7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9.scope.
Jan 21 19:13:13 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:13:13 np0005591285 podman[230116]: 2026-01-22 00:13:13.842546988 +0000 UTC m=+0.027405221 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:13:13 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535239dff6939f44a806ad99cd1a8558d383b084a97a2556b0db0a01baad9098/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:13:13 np0005591285 podman[230116]: 2026-01-22 00:13:13.95140437 +0000 UTC m=+0.136262583 container init 7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:13:13 np0005591285 podman[230116]: 2026-01-22 00:13:13.956974999 +0000 UTC m=+0.141833192 container start 7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:13:13 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [NOTICE]   (230136) : New worker (230138) forked
Jan 21 19:13:13 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [NOTICE]   (230136) : Loading success.
Jan 21 19:13:14 np0005591285 nova_compute[182755]: 2026-01-22 00:13:14.072 182759 DEBUG nova.network.neutron [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Updated VIF entry in instance network info cache for port f091e31e-112e-4a90-9947-5a807f422c9c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:13:14 np0005591285 nova_compute[182755]: 2026-01-22 00:13:14.074 182759 DEBUG nova.network.neutron [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Updating instance_info_cache with network_info: [{"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:14 np0005591285 nova_compute[182755]: 2026-01-22 00:13:14.089 182759 DEBUG oslo_concurrency.lockutils [req-4df39661-500b-48e3-bb0a-9a0b42e97cbd req-123190da-d41e-44f2-9f07-24570c612c9b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.385 182759 DEBUG nova.compute.manager [req-f4ecc211-2f81-4539-8fe4-f92d0dfe79b3 req-03c223f2-9dfd-4174-aea2-cada6042b64b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.386 182759 DEBUG oslo_concurrency.lockutils [req-f4ecc211-2f81-4539-8fe4-f92d0dfe79b3 req-03c223f2-9dfd-4174-aea2-cada6042b64b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.386 182759 DEBUG oslo_concurrency.lockutils [req-f4ecc211-2f81-4539-8fe4-f92d0dfe79b3 req-03c223f2-9dfd-4174-aea2-cada6042b64b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.386 182759 DEBUG oslo_concurrency.lockutils [req-f4ecc211-2f81-4539-8fe4-f92d0dfe79b3 req-03c223f2-9dfd-4174-aea2-cada6042b64b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.387 182759 DEBUG nova.compute.manager [req-f4ecc211-2f81-4539-8fe4-f92d0dfe79b3 req-03c223f2-9dfd-4174-aea2-cada6042b64b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] No waiting events found dispatching network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.387 182759 WARNING nova.compute.manager [req-f4ecc211-2f81-4539-8fe4-f92d0dfe79b3 req-03c223f2-9dfd-4174-aea2-cada6042b64b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received unexpected event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c for instance with vm_state active and task_state None.#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.927 182759 DEBUG nova.compute.manager [req-e596daf8-5223-45d0-a309-22b782ad4e2d req-5c47ac03-ac6d-46b9-8906-2a7862a55dc5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.929 182759 DEBUG oslo_concurrency.lockutils [req-e596daf8-5223-45d0-a309-22b782ad4e2d req-5c47ac03-ac6d-46b9-8906-2a7862a55dc5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.930 182759 DEBUG oslo_concurrency.lockutils [req-e596daf8-5223-45d0-a309-22b782ad4e2d req-5c47ac03-ac6d-46b9-8906-2a7862a55dc5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.931 182759 DEBUG oslo_concurrency.lockutils [req-e596daf8-5223-45d0-a309-22b782ad4e2d req-5c47ac03-ac6d-46b9-8906-2a7862a55dc5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.931 182759 DEBUG nova.compute.manager [req-e596daf8-5223-45d0-a309-22b782ad4e2d req-5c47ac03-ac6d-46b9-8906-2a7862a55dc5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:15 np0005591285 nova_compute[182755]: 2026-01-22 00:13:15.932 182759 WARNING nova.compute.manager [req-e596daf8-5223-45d0-a309-22b782ad4e2d req-5c47ac03-ac6d-46b9-8906-2a7862a55dc5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:13:16 np0005591285 nova_compute[182755]: 2026-01-22 00:13:16.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:17 np0005591285 nova_compute[182755]: 2026-01-22 00:13:17.470 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:21 np0005591285 nova_compute[182755]: 2026-01-22 00:13:21.423 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:21 np0005591285 nova_compute[182755]: 2026-01-22 00:13:21.813 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:21 np0005591285 nova_compute[182755]: 2026-01-22 00:13:21.814 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:21 np0005591285 nova_compute[182755]: 2026-01-22 00:13:21.815 182759 INFO nova.compute.manager [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Rebooting instance#033[00m
Jan 21 19:13:22 np0005591285 nova_compute[182755]: 2026-01-22 00:13:22.108 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:22 np0005591285 nova_compute[182755]: 2026-01-22 00:13:22.108 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquired lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:22 np0005591285 nova_compute[182755]: 2026-01-22 00:13:22.109 182759 DEBUG nova.network.neutron [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:13:22 np0005591285 nova_compute[182755]: 2026-01-22 00:13:22.473 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:23 np0005591285 podman[230148]: 2026-01-22 00:13:23.19437968 +0000 UTC m=+0.054694730 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 21 19:13:23 np0005591285 podman[230147]: 2026-01-22 00:13:23.20491914 +0000 UTC m=+0.066501763 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Jan 21 19:13:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:25Z|00446|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 21 19:13:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:25Z|00447|binding|INFO|Releasing lport 0b252d65-412b-4740-a674-4727d3037b7f from this chassis (sb_readonly=0)
Jan 21 19:13:25 np0005591285 NetworkManager[55017]: <info>  [1769040805.4449] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 21 19:13:25 np0005591285 nova_compute[182755]: 2026-01-22 00:13:25.444 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:25 np0005591285 NetworkManager[55017]: <info>  [1769040805.4455] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Jan 21 19:13:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:25Z|00448|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 21 19:13:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:25Z|00449|binding|INFO|Releasing lport 0b252d65-412b-4740-a674-4727d3037b7f from this chassis (sb_readonly=0)
Jan 21 19:13:25 np0005591285 nova_compute[182755]: 2026-01-22 00:13:25.473 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:25 np0005591285 nova_compute[182755]: 2026-01-22 00:13:25.483 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:25Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:98:77 10.100.0.11
Jan 21 19:13:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:25Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:98:77 10.100.0.11
Jan 21 19:13:25 np0005591285 nova_compute[182755]: 2026-01-22 00:13:25.752 182759 DEBUG nova.network.neutron [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Updating instance_info_cache with network_info: [{"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:25 np0005591285 nova_compute[182755]: 2026-01-22 00:13:25.950 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Releasing lock "refresh_cache-ee09d802-1f59-4f58-befa-a281fe642b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:26Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:6f:67 10.100.0.10
Jan 21 19:13:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:26Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:6f:67 10.100.0.10
Jan 21 19:13:26 np0005591285 nova_compute[182755]: 2026-01-22 00:13:26.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:26 np0005591285 nova_compute[182755]: 2026-01-22 00:13:26.537 182759 DEBUG nova.compute.manager [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-changed-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:26 np0005591285 nova_compute[182755]: 2026-01-22 00:13:26.539 182759 DEBUG nova.compute.manager [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Refreshing instance network info cache due to event network-changed-f091e31e-112e-4a90-9947-5a807f422c9c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:13:26 np0005591285 nova_compute[182755]: 2026-01-22 00:13:26.539 182759 DEBUG oslo_concurrency.lockutils [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:26 np0005591285 nova_compute[182755]: 2026-01-22 00:13:26.540 182759 DEBUG oslo_concurrency.lockutils [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:26 np0005591285 nova_compute[182755]: 2026-01-22 00:13:26.540 182759 DEBUG nova.network.neutron [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Refreshing network info cache for port f091e31e-112e-4a90-9947-5a807f422c9c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:13:27 np0005591285 nova_compute[182755]: 2026-01-22 00:13:27.416 182759 DEBUG nova.compute.manager [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:27 np0005591285 nova_compute[182755]: 2026-01-22 00:13:27.475 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:27 np0005591285 nova_compute[182755]: 2026-01-22 00:13:27.958 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 kernel: tapdc2fa6e9-f8 (unregistering): left promiscuous mode
Jan 21 19:13:30 np0005591285 NetworkManager[55017]: <info>  [1769040810.5521] device (tapdc2fa6e9-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.564 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00450|binding|INFO|Releasing lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 from this chassis (sb_readonly=0)
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00451|binding|INFO|Setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 down in Southbound
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00452|binding|INFO|Removing iface tapdc2fa6e9-f8 ovn-installed in OVS
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.566 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.589 182759 DEBUG nova.network.neutron [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Updated VIF entry in instance network info cache for port f091e31e-112e-4a90-9947-5a807f422c9c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.590 182759 DEBUG nova.network.neutron [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Updating instance_info_cache with network_info: [{"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:30 np0005591285 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 21 19:13:30 np0005591285 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000076.scope: Consumed 13.522s CPU time.
Jan 21 19:13:30 np0005591285 systemd-machined[154022]: Machine qemu-54-instance-00000076 terminated.
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.637 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:6f:67 10.100.0.10'], port_security=['fa:16:3e:f6:6f:67 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ee09d802-1f59-4f58-befa-a281fe642b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '873b2f2688e942d5924aa81fa18d84c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e0235df0-8e10-4f2c-bca1-3481f216c1a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8d061-740c-490c-81d2-02d385a2e787, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.639 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 in datapath d1e3ff28-7ba4-4007-895f-2557b60edefb unbound from our chassis#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.640 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1e3ff28-7ba4-4007-895f-2557b60edefb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.642 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[98ef264f-49f4-40d1-80fc-1bf095f757fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.642 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb namespace which is not needed anymore#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.667 182759 DEBUG oslo_concurrency.lockutils [req-4ccd5df2-edba-4711-b37a-369c4ae34b5b req-7512b272-3ee4-46f6-aefa-8d4a12364b1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-118577c2-2440-472a-b858-f075b2a804b1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:30 np0005591285 kernel: tapdc2fa6e9-f8: entered promiscuous mode
Jan 21 19:13:30 np0005591285 NetworkManager[55017]: <info>  [1769040810.7452] manager: (tapdc2fa6e9-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Jan 21 19:13:30 np0005591285 systemd-udevd[230220]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00453|binding|INFO|Claiming lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for this chassis.
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00454|binding|INFO|dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9: Claiming fa:16:3e:f6:6f:67 10.100.0.10
Jan 21 19:13:30 np0005591285 kernel: tapdc2fa6e9-f8 (unregistering): left promiscuous mode
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.746 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.771 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00455|if_status|INFO|Dropped 1 log messages in last 887 seconds (most recently, 887 seconds ago) due to excessive rate
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00456|if_status|INFO|Not setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 down as sb is readonly
Jan 21 19:13:30 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [NOTICE]   (230136) : haproxy version is 2.8.14-c23fe91
Jan 21 19:13:30 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [NOTICE]   (230136) : path to executable is /usr/sbin/haproxy
Jan 21 19:13:30 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [WARNING]  (230136) : Exiting Master process...
Jan 21 19:13:30 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [WARNING]  (230136) : Exiting Master process...
Jan 21 19:13:30 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [ALERT]    (230136) : Current worker (230138) exited with code 143 (Terminated)
Jan 21 19:13:30 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230132]: [WARNING]  (230136) : All workers exited. Exiting... (0)
Jan 21 19:13:30 np0005591285 systemd[1]: libpod-7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9.scope: Deactivated successfully.
Jan 21 19:13:30 np0005591285 podman[230240]: 2026-01-22 00:13:30.790288506 +0000 UTC m=+0.054124173 container died 7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.804 182759 INFO nova.virt.libvirt.driver [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance destroyed successfully.#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.806 182759 DEBUG nova.objects.instance [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'resources' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:30 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9-userdata-shm.mount: Deactivated successfully.
Jan 21 19:13:30 np0005591285 systemd[1]: var-lib-containers-storage-overlay-535239dff6939f44a806ad99cd1a8558d383b084a97a2556b0db0a01baad9098-merged.mount: Deactivated successfully.
Jan 21 19:13:30 np0005591285 podman[230240]: 2026-01-22 00:13:30.825721231 +0000 UTC m=+0.089556888 container cleanup 7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 19:13:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:30Z|00457|binding|INFO|Releasing lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 from this chassis (sb_readonly=0)
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.827 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:6f:67 10.100.0.10'], port_security=['fa:16:3e:f6:6f:67 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ee09d802-1f59-4f58-befa-a281fe642b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '873b2f2688e942d5924aa81fa18d84c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e0235df0-8e10-4f2c-bca1-3481f216c1a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8d061-740c-490c-81d2-02d385a2e787, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.837 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 systemd[1]: libpod-conmon-7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9.scope: Deactivated successfully.
Jan 21 19:13:30 np0005591285 podman[230280]: 2026-01-22 00:13:30.881647762 +0000 UTC m=+0.034802679 container remove 7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.886 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[351d860f-0cac-4e19-816b-8f21ff4111c9]: (4, ('Thu Jan 22 12:13:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb (7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9)\n7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9\nThu Jan 22 12:13:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb (7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9)\n7bb8c357101b8c87b3316ae8b53e444bd75e19cae4669bd4868c4f4988fe56b9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.888 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[22684d6d-e647-435b-86e7-28efdee2a225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.889 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e3ff28-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.891 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 kernel: tapd1e3ff28-70: left promiscuous mode
Jan 21 19:13:30 np0005591285 nova_compute[182755]: 2026-01-22 00:13:30.906 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.908 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c0e876-a521-4cfa-8f66-a66c25bf9aca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.921 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2e747da8-1558-4334-9baa-4758ad56bf1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.922 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd40856-61e4-4839-be53-9984c0bfa770]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.937 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e76996-a93a-436f-a9ed-9d1323f223be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535187, 'reachable_time': 24360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230299, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 systemd[1]: run-netns-ovnmeta\x2dd1e3ff28\x2d7ba4\x2d4007\x2d895f\x2d2557b60edefb.mount: Deactivated successfully.
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.941 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.941 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[43be9e04-4f1d-4e94-906b-a27c823b3c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.942 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 in datapath d1e3ff28-7ba4-4007-895f-2557b60edefb bound to our chassis#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.944 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1e3ff28-7ba4-4007-895f-2557b60edefb#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.953 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[040c6987-83dd-4fe1-8c4c-e617505df177]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.954 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1e3ff28-71 in ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.957 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1e3ff28-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.957 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a9842595-92ae-4ab8-a8b1-bd50bd0b377d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.958 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[35e0c2d0-f76e-4747-89a6-f8e0c4f8e002]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.969 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[bda7d955-8455-4389-9786-dff8cb0f050a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:30.990 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[04a156dd-9ae9-408d-a417-2fb68201c744]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.017 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec6e640-8fbc-45bc-92c7-a10f19235a58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.021 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[974ab4bf-9ddf-4710-98ff-8c1102f64fe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 NetworkManager[55017]: <info>  [1769040811.0229] manager: (tapd1e3ff28-70): new Veth device (/org/freedesktop/NetworkManager/Devices/221)
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.051 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2e44ffc6-941d-4a30-96f4-21b9503813b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.053 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[03ac6e83-d379-46cc-b581-1d886a3426ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 NetworkManager[55017]: <info>  [1769040811.0781] device (tapd1e3ff28-70): carrier: link connected
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.086 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6bced1b1-38c6-4ad9-9370-ba99b5b4149e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.112 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6255f005-19d8-4f8a-bc1d-57317d6948f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e3ff28-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:5b:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536973, 'reachable_time': 35509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230327, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.129 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59b0e492-03d7-4100-aeac-2e601463d8ae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:5b09'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536973, 'tstamp': 536973}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230328, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.156 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5bd546-1b9f-477d-a062-228f4747049f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e3ff28-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:5b:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536973, 'reachable_time': 35509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230329, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.189 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:6f:67 10.100.0.10'], port_security=['fa:16:3e:f6:6f:67 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ee09d802-1f59-4f58-befa-a281fe642b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '873b2f2688e942d5924aa81fa18d84c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e0235df0-8e10-4f2c-bca1-3481f216c1a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8d061-740c-490c-81d2-02d385a2e787, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.195 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[30c72f06-5481-48ab-b05e-fd138af1d2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.202 182759 DEBUG nova.virt.libvirt.vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:12:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-300361081',display_name='tempest-InstanceActionsTestJSON-server-300361081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-300361081',id=118,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:13:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='873b2f2688e942d5924aa81fa18d84c0',ramdisk_id='',reservation_id='r-0sonkysu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-232501859',owner_user_name='tempest-InstanceActionsTestJSON-232501859-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:13:28Z,user_data=None,user_id='9ee45ba20dd444a5a5e88aa96cc8e043',uuid=ee09d802-1f59-4f58-befa-a281fe642b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.203 182759 DEBUG nova.network.os_vif_util [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converting VIF {"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.204 182759 DEBUG nova.network.os_vif_util [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.204 182759 DEBUG os_vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.207 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc2fa6e9-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.210 182759 DEBUG nova.compute.manager [req-2c452fcc-d20e-4195-8db6-1e79ae07460a req-acfb3fcb-2caf-41e6-9c0c-ec5b665c3c14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-unplugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.210 182759 DEBUG oslo_concurrency.lockutils [req-2c452fcc-d20e-4195-8db6-1e79ae07460a req-acfb3fcb-2caf-41e6-9c0c-ec5b665c3c14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.211 182759 DEBUG oslo_concurrency.lockutils [req-2c452fcc-d20e-4195-8db6-1e79ae07460a req-acfb3fcb-2caf-41e6-9c0c-ec5b665c3c14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.211 182759 DEBUG oslo_concurrency.lockutils [req-2c452fcc-d20e-4195-8db6-1e79ae07460a req-acfb3fcb-2caf-41e6-9c0c-ec5b665c3c14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.211 182759 DEBUG nova.compute.manager [req-2c452fcc-d20e-4195-8db6-1e79ae07460a req-acfb3fcb-2caf-41e6-9c0c-ec5b665c3c14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-unplugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.211 182759 WARNING nova.compute.manager [req-2c452fcc-d20e-4195-8db6-1e79ae07460a req-acfb3fcb-2caf-41e6-9c0c-ec5b665c3c14 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-unplugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state active and task_state reboot_started_hard.#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.212 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.213 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.215 182759 INFO os_vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8')#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.223 182759 DEBUG nova.virt.libvirt.driver [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Start _get_guest_xml network_info=[{"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.227 182759 WARNING nova.virt.libvirt.driver [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.234 182759 DEBUG nova.virt.libvirt.host [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.235 182759 DEBUG nova.virt.libvirt.host [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.239 182759 DEBUG nova.virt.libvirt.host [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.240 182759 DEBUG nova.virt.libvirt.host [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.241 182759 DEBUG nova.virt.libvirt.driver [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.242 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.242 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.242 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.243 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.243 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.243 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.244 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.244 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.244 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.244 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.245 182759 DEBUG nova.virt.hardware [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.245 182759 DEBUG nova.objects.instance [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.281 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[13f86d17-6e93-4107-909a-428f36c37ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.283 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e3ff28-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.283 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.283 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1e3ff28-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.285 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 NetworkManager[55017]: <info>  [1769040811.2864] manager: (tapd1e3ff28-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Jan 21 19:13:31 np0005591285 kernel: tapd1e3ff28-70: entered promiscuous mode
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.288 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.290 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1e3ff28-70, col_values=(('external_ids', {'iface-id': '0b252d65-412b-4740-a674-4727d3037b7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:31Z|00458|binding|INFO|Releasing lport 0b252d65-412b-4740-a674-4727d3037b7f from this chassis (sb_readonly=0)
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.291 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.309 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.309 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.310 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c517d47c-34e8-45f5-a6db-f8f7a691564a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.311 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-d1e3ff28-7ba4-4007-895f-2557b60edefb
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID d1e3ff28-7ba4-4007-895f-2557b60edefb
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.312 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'env', 'PROCESS_TAG=haproxy-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1e3ff28-7ba4-4007-895f-2557b60edefb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.330 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.371 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.372 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.372 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.373 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.375 182759 DEBUG nova.virt.libvirt.vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:12:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-300361081',display_name='tempest-InstanceActionsTestJSON-server-300361081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-300361081',id=118,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:13:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='873b2f2688e942d5924aa81fa18d84c0',ramdisk_id='',reservation_id='r-0sonkysu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-232501859',owner_user_name='tempest-InstanceActionsTestJSON-232501859-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:13:28Z,user_data=None,user_id='9ee45ba20dd444a5a5e88aa96cc8e043',uuid=ee09d802-1f59-4f58-befa-a281fe642b6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.376 182759 DEBUG nova.network.os_vif_util [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converting VIF {"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.377 182759 DEBUG nova.network.os_vif_util [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.379 182759 DEBUG nova.objects.instance [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.572 182759 DEBUG nova.virt.libvirt.driver [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <uuid>ee09d802-1f59-4f58-befa-a281fe642b6b</uuid>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <name>instance-00000076</name>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:name>tempest-InstanceActionsTestJSON-server-300361081</nova:name>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:13:31</nova:creationTime>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:user uuid="9ee45ba20dd444a5a5e88aa96cc8e043">tempest-InstanceActionsTestJSON-232501859-project-member</nova:user>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:project uuid="873b2f2688e942d5924aa81fa18d84c0">tempest-InstanceActionsTestJSON-232501859</nova:project>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        <nova:port uuid="dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <entry name="serial">ee09d802-1f59-4f58-befa-a281fe642b6b</entry>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <entry name="uuid">ee09d802-1f59-4f58-befa-a281fe642b6b</entry>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk.config"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:f6:6f:67"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <target dev="tapdc2fa6e9-f8"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/console.log" append="off"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:13:31 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:13:31 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:13:31 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:13:31 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.574 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.614 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.636 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.637 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:31 np0005591285 podman[230362]: 2026-01-22 00:13:31.66477017 +0000 UTC m=+0.046581122 container create c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 19:13:31 np0005591285 systemd[1]: Started libpod-conmon-c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf.scope.
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.707 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.709 182759 DEBUG nova.objects.instance [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:31 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:13:31 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b1d8fe1f860a079c3e338dedbc081f92cdea2eea14c1402697e38ecef429afb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:13:31 np0005591285 podman[230362]: 2026-01-22 00:13:31.726987349 +0000 UTC m=+0.108798351 container init c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 19:13:31 np0005591285 podman[230362]: 2026-01-22 00:13:31.732565598 +0000 UTC m=+0.114376570 container start c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.734 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:31 np0005591285 podman[230362]: 2026-01-22 00:13:31.639889777 +0000 UTC m=+0.021700759 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [NOTICE]   (230386) : New worker (230389) forked
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [NOTICE]   (230386) : Loading success.
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.799 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 in datapath d1e3ff28-7ba4-4007-895f-2557b60edefb unbound from our chassis#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.801 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1e3ff28-7ba4-4007-895f-2557b60edefb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.802 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7707f54e-e5b7-420a-beb4-0b31fd3fe430]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:31 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:31.802 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb namespace which is not needed anymore#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.805 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.806 182759 DEBUG nova.virt.disk.api [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Checking if we can resize image /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.807 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.875 182759 DEBUG oslo_concurrency.processutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.876 182759 DEBUG nova.virt.disk.api [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Cannot resize image /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.877 182759 DEBUG nova.objects.instance [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'migration_context' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [NOTICE]   (230386) : haproxy version is 2.8.14-c23fe91
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [NOTICE]   (230386) : path to executable is /usr/sbin/haproxy
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [WARNING]  (230386) : Exiting Master process...
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [WARNING]  (230386) : Exiting Master process...
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [ALERT]    (230386) : Current worker (230389) exited with code 143 (Terminated)
Jan 21 19:13:31 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230382]: [WARNING]  (230386) : All workers exited. Exiting... (0)
Jan 21 19:13:31 np0005591285 systemd[1]: libpod-c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf.scope: Deactivated successfully.
Jan 21 19:13:31 np0005591285 podman[230418]: 2026-01-22 00:13:31.927746292 +0000 UTC m=+0.045687879 container died c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:13:31 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf-userdata-shm.mount: Deactivated successfully.
Jan 21 19:13:31 np0005591285 systemd[1]: var-lib-containers-storage-overlay-5b1d8fe1f860a079c3e338dedbc081f92cdea2eea14c1402697e38ecef429afb-merged.mount: Deactivated successfully.
Jan 21 19:13:31 np0005591285 podman[230418]: 2026-01-22 00:13:31.956457197 +0000 UTC m=+0.074398784 container cleanup c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:13:31 np0005591285 systemd[1]: libpod-conmon-c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf.scope: Deactivated successfully.
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.980 182759 DEBUG nova.virt.libvirt.vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:12:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-300361081',display_name='tempest-InstanceActionsTestJSON-server-300361081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-300361081',id=118,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:13:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='873b2f2688e942d5924aa81fa18d84c0',ramdisk_id='',reservation_id='r-0sonkysu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-232501859',owner_user_name='tempest-InstanceActionsTestJSON-232501859-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:13:28Z,user_data=None,user_id='9ee45ba20dd444a5a5e88aa96cc8e043',uuid=ee09d802-1f59-4f58-befa-a281fe642b6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.980 182759 DEBUG nova.network.os_vif_util [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converting VIF {"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.981 182759 DEBUG nova.network.os_vif_util [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.982 182759 DEBUG os_vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.982 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.983 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.984 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.986 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.986 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc2fa6e9-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.987 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc2fa6e9-f8, col_values=(('external_ids', {'iface-id': 'dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:6f:67', 'vm-uuid': 'ee09d802-1f59-4f58-befa-a281fe642b6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.988 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 NetworkManager[55017]: <info>  [1769040811.9900] manager: (tapdc2fa6e9-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.991 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.995 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:31 np0005591285 nova_compute[182755]: 2026-01-22 00:13:31.996 182759 INFO os_vif [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8')#033[00m
Jan 21 19:13:32 np0005591285 podman[230450]: 2026-01-22 00:13:32.018516262 +0000 UTC m=+0.042899625 container remove c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.022 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[35e2e527-ac06-43a1-adcf-53a8b174c9fb]: (4, ('Thu Jan 22 12:13:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb (c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf)\nc076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf\nThu Jan 22 12:13:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb (c076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf)\nc076551188c3cdf575cd5ddab00cab7f1ac80c2d140c6e990e9c5fe3b9d7efdf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.024 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ed3d6764-1424-4909-b4c7-876128ea8017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.026 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e3ff28-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:32 np0005591285 kernel: tapd1e3ff28-70: left promiscuous mode
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.027 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.044 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.047 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[00e695e0-0302-4fdc-8a6e-18eeae8ea3f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.059 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ab14e56e-7340-40d6-a70f-ad1b66b338e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.060 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d15b03-9b43-4928-9287-d8cb6abd1aac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 kernel: tapdc2fa6e9-f8: entered promiscuous mode
Jan 21 19:13:32 np0005591285 systemd-udevd[230317]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.0706] manager: (tapdc2fa6e9-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00459|binding|INFO|Claiming lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for this chassis.
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00460|binding|INFO|dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9: Claiming fa:16:3e:f6:6f:67 10.100.0.10
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.074 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6119c596-3016-47f0-bd0f-8c2ef7ad422a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536967, 'reachable_time': 25672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230478, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 systemd[1]: run-netns-ovnmeta\x2dd1e3ff28\x2d7ba4\x2d4007\x2d895f\x2d2557b60edefb.mount: Deactivated successfully.
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.081 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.082 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[72215c10-e2e4-4baf-97b6-cf9c2e9d8212]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00461|binding|INFO|Setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 ovn-installed in OVS
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.082 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.083 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.0856] device (tapdc2fa6e9-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.0862] device (tapdc2fa6e9-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.087 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 systemd-machined[154022]: New machine qemu-55-instance-00000076.
Jan 21 19:13:32 np0005591285 systemd[1]: Started Virtual Machine qemu-55-instance-00000076.
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00462|binding|INFO|Setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 up in Southbound
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.151 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:6f:67 10.100.0.10'], port_security=['fa:16:3e:f6:6f:67 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ee09d802-1f59-4f58-befa-a281fe642b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '873b2f2688e942d5924aa81fa18d84c0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e0235df0-8e10-4f2c-bca1-3481f216c1a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8d061-740c-490c-81d2-02d385a2e787, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.153 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 in datapath d1e3ff28-7ba4-4007-895f-2557b60edefb bound to our chassis#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.154 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1e3ff28-7ba4-4007-895f-2557b60edefb#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.164 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a53fef19-d97f-41a6-9ad2-c336a432769a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.165 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1e3ff28-71 in ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.166 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1e3ff28-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.166 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e0038b-ca94-4fa6-b217-3c33ef462cf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.167 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5eda89e6-4d2a-4ffc-9326-d12a4c2d7a39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.177 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c1426b-0d4b-47d2-a9ae-3a7c6b486d56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.196 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1bbadf9a-55c9-4fba-bb1b-68acf8e5c507]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.224 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcf597d-6fdf-49dc-972f-394001997f37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.230 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[da18f84e-d02a-44f0-b6e3-6ba37e9a3949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.2311] manager: (tapd1e3ff28-70): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.264 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[2368079b-79f4-4f5f-acfd-b57d6b6af8a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.267 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[778917dd-195a-4e48-b36a-a973c1e83865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.2883] device (tapd1e3ff28-70): carrier: link connected
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.293 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a4f4aa-1469-41e3-ad7e-c988c411fd80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.312 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[56156eb8-7b76-42a2-b9b5-8e6cb9454eb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e3ff28-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:5b:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537094, 'reachable_time': 44528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230511, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.324 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[32f2d190-85ac-48bb-bf0e-feb330bfae23]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:5b09'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537094, 'tstamp': 537094}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230518, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 podman[230499]: 2026-01-22 00:13:32.328301591 +0000 UTC m=+0.051092173 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.341 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2c092c0c-fca7-44f1-a5fc-b4dd96fe9c6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1e3ff28-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:5b:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537094, 'reachable_time': 44528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230526, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.368 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[79caa9ac-766c-4037-bd67-98bbc5228910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.419 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe9a400-872e-4d0d-8b50-6129044ffbc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.421 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e3ff28-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.422 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.422 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1e3ff28-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:32 np0005591285 kernel: tapd1e3ff28-70: entered promiscuous mode
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.4246] manager: (tapd1e3ff28-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.429 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1e3ff28-70, col_values=(('external_ids', {'iface-id': '0b252d65-412b-4740-a674-4727d3037b7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.430 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00463|binding|INFO|Releasing lport 0b252d65-412b-4740-a674-4727d3037b7f from this chassis (sb_readonly=0)
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.432 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.433 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad01c96-7d9b-4aba-a5b1-20dad5fd0913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.433 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-d1e3ff28-7ba4-4007-895f-2557b60edefb
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/d1e3ff28-7ba4-4007-895f-2557b60edefb.pid.haproxy
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID d1e3ff28-7ba4-4007-895f-2557b60edefb
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:13:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:32.435 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'env', 'PROCESS_TAG=haproxy-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1e3ff28-7ba4-4007-895f-2557b60edefb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.439 182759 INFO nova.compute.manager [None req-69884c76-b344-4b77-a3b2-7b7f0d92ea38 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Get console output#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.474 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.480 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.481 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.571 182759 DEBUG nova.virt.libvirt.host [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Removed pending event for ee09d802-1f59-4f58-befa-a281fe642b6b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.571 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040812.5710437, ee09d802-1f59-4f58-befa-a281fe642b6b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.571 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.573 182759 DEBUG nova.compute.manager [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.580 182759 INFO nova.virt.libvirt.driver [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance rebooted successfully.#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.580 182759 DEBUG nova.compute.manager [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.618 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.622 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.664 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.665 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040812.572732, ee09d802-1f59-4f58-befa-a281fe642b6b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.665 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] VM Started (Lifecycle Event)#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.695 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.704 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.707 182759 DEBUG oslo_concurrency.lockutils [None req-9dcb3324-6936-4a1b-8c6b-567b0c7bea41 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 10.893s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:32 np0005591285 podman[230565]: 2026-01-22 00:13:32.822546959 +0000 UTC m=+0.053628151 container create e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:13:32 np0005591285 systemd[1]: Started libpod-conmon-e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac.scope.
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.883 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.884 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.884 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.884 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.884 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:32 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:13:32 np0005591285 podman[230565]: 2026-01-22 00:13:32.797039888 +0000 UTC m=+0.028121110 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:13:32 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fcc1b5d5a50ea09411a81c9ba9eddf74d91e206163d9462057fc5e040e660631/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.903 182759 INFO nova.compute.manager [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Terminating instance#033[00m
Jan 21 19:13:32 np0005591285 podman[230565]: 2026-01-22 00:13:32.907295348 +0000 UTC m=+0.138376570 container init e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:13:32 np0005591285 podman[230565]: 2026-01-22 00:13:32.913952366 +0000 UTC m=+0.145033558 container start e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.915 182759 DEBUG nova.compute.manager [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:13:32 np0005591285 kernel: tapf091e31e-11 (unregistering): left promiscuous mode
Jan 21 19:13:32 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [NOTICE]   (230585) : New worker (230587) forked
Jan 21 19:13:32 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [NOTICE]   (230585) : Loading success.
Jan 21 19:13:32 np0005591285 NetworkManager[55017]: <info>  [1769040812.9457] device (tapf091e31e-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00464|binding|INFO|Releasing lport f091e31e-112e-4a90-9947-5a807f422c9c from this chassis (sb_readonly=0)
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00465|binding|INFO|Setting lport f091e31e-112e-4a90-9947-5a807f422c9c down in Southbound
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:32Z|00466|binding|INFO|Removing iface tapf091e31e-11 ovn-installed in OVS
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:32 np0005591285 nova_compute[182755]: 2026-01-22 00:13:32.980 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:33 np0005591285 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000077.scope: Deactivated successfully.
Jan 21 19:13:33 np0005591285 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000077.scope: Consumed 12.712s CPU time.
Jan 21 19:13:33 np0005591285 systemd-machined[154022]: Machine qemu-53-instance-00000077 terminated.
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.042 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:98:77 10.100.0.11'], port_security=['fa:16:3e:a7:98:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '118577c2-2440-472a-b858-f075b2a804b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '08295c1c-ae1e-44be-8dc2-34f42af0072b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d452ef76-084d-4578-ab80-dfb49c9c8f9b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=f091e31e-112e-4a90-9947-5a807f422c9c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.044 104259 INFO neutron.agent.ovn.metadata.agent [-] Port f091e31e-112e-4a90-9947-5a807f422c9c in datapath 88a7330a-aaa1-424a-b4dc-f7500e450abb unbound from our chassis#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.045 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88a7330a-aaa1-424a-b4dc-f7500e450abb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.046 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8512414a-1dca-4c38-85c3-20022401a010]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.047 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb namespace which is not needed anymore#033[00m
Jan 21 19:13:33 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [NOTICE]   (230055) : haproxy version is 2.8.14-c23fe91
Jan 21 19:13:33 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [NOTICE]   (230055) : path to executable is /usr/sbin/haproxy
Jan 21 19:13:33 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [WARNING]  (230055) : Exiting Master process...
Jan 21 19:13:33 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [WARNING]  (230055) : Exiting Master process...
Jan 21 19:13:33 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [ALERT]    (230055) : Current worker (230057) exited with code 143 (Terminated)
Jan 21 19:13:33 np0005591285 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[230051]: [WARNING]  (230055) : All workers exited. Exiting... (0)
Jan 21 19:13:33 np0005591285 systemd[1]: libpod-501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f.scope: Deactivated successfully.
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.174 182759 INFO nova.virt.libvirt.driver [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Instance destroyed successfully.#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.178 182759 DEBUG nova.objects.instance [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 118577c2-2440-472a-b858-f075b2a804b1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:33 np0005591285 podman[230617]: 2026-01-22 00:13:33.198006209 +0000 UTC m=+0.074331773 container died 501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:13:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f-userdata-shm.mount: Deactivated successfully.
Jan 21 19:13:33 np0005591285 systemd[1]: var-lib-containers-storage-overlay-dd98519f1c4d189d5ef351bd8afbaf70b73c3fd624b829aeadadb977d87c64d0-merged.mount: Deactivated successfully.
Jan 21 19:13:33 np0005591285 podman[230617]: 2026-01-22 00:13:33.246326517 +0000 UTC m=+0.122652081 container cleanup 501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:13:33 np0005591285 systemd[1]: libpod-conmon-501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f.scope: Deactivated successfully.
Jan 21 19:13:33 np0005591285 podman[230661]: 2026-01-22 00:13:33.321153433 +0000 UTC m=+0.044324703 container remove 501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.327 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d805e7e8-635b-490e-87ab-4945bf814d19]: (4, ('Thu Jan 22 12:13:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb (501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f)\n501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f\nThu Jan 22 12:13:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb (501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f)\n501174ea5462386b7fef34abf0d453472b0cd0177d50a0b81c2e28cc2ee7608f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.328 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e7a8af-3a5e-47ee-aa9a-f842f64208b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.329 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88a7330a-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:33 np0005591285 kernel: tap88a7330a-a0: left promiscuous mode
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.347 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.349 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[92e0bc9d-b209-4940-9f5c-07dd554527fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.364 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[95d07c44-0fd3-4e71-89f2-72f8592120ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.365 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[68666ef7-cd5a-4e35-9939-3b72f2225630]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.379 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2083d6-3d3b-49ad-9d6d-00daba60cc9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535102, 'reachable_time': 35995, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230680, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 systemd[1]: run-netns-ovnmeta\x2d88a7330a\x2daaa1\x2d424a\x2db4dc\x2df7500e450abb.mount: Deactivated successfully.
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.383 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:13:33 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:33.383 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[abd9bcc8-e22f-4831-972a-493fa3763b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.849 182759 DEBUG nova.virt.libvirt.vif [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:12:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1399445424',display_name='tempest-TestNetworkBasicOps-server-1399445424',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1399445424',id=119,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVR4jwL+EdUdOjKT2r4p3G73DZqMg92wtbjTJGvSWwJFmKZ8OOzFfVJeEzaB4zscbAeSw1Wszev9bU02pUh4vhhfe5YmWTWD6v3j1dlzOjM4Q/vNeMgCbEUgq0iq35//w==',key_name='tempest-TestNetworkBasicOps-1037238834',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:13:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-25upqi0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:13:13Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=118577c2-2440-472a-b858-f075b2a804b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.850 182759 DEBUG nova.network.os_vif_util [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "f091e31e-112e-4a90-9947-5a807f422c9c", "address": "fa:16:3e:a7:98:77", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf091e31e-11", "ovs_interfaceid": "f091e31e-112e-4a90-9947-5a807f422c9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.851 182759 DEBUG nova.network.os_vif_util [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.851 182759 DEBUG os_vif [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.853 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.853 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf091e31e-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.855 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.856 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.858 182759 INFO os_vif [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:98:77,bridge_name='br-int',has_traffic_filtering=True,id=f091e31e-112e-4a90-9947-5a807f422c9c,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf091e31e-11')#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.859 182759 INFO nova.virt.libvirt.driver [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Deleting instance files /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1_del#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.860 182759 INFO nova.virt.libvirt.driver [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Deletion of /var/lib/nova/instances/118577c2-2440-472a-b858-f075b2a804b1_del complete#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.869 182759 DEBUG nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.869 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.869 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.871 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.871 182759 DEBUG nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.872 182759 WARNING nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.872 182759 DEBUG nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.872 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.873 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.873 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.873 182759 DEBUG nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.873 182759 WARNING nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.874 182759 DEBUG nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.874 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.874 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.874 182759 DEBUG oslo_concurrency.lockutils [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.875 182759 DEBUG nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:33 np0005591285 nova_compute[182755]: 2026-01-22 00:13:33.875 182759 WARNING nova.compute.manager [req-3bbd0388-20de-42fe-a1f6-46ca1c715863 req-616aee6b-27a0-4dcd-95e6-b14cc607507c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.120 182759 INFO nova.compute.manager [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Took 1.21 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.121 182759 DEBUG oslo.service.loopingcall [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.121 182759 DEBUG nova.compute.manager [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.121 182759 DEBUG nova.network.neutron [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.639 182759 DEBUG nova.compute.manager [req-9b3715c9-9c07-4bb0-b0a6-0dfbc2e8c0db req-fb8539ad-0603-4e30-8e27-215828f1e6f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-vif-unplugged-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.639 182759 DEBUG oslo_concurrency.lockutils [req-9b3715c9-9c07-4bb0-b0a6-0dfbc2e8c0db req-fb8539ad-0603-4e30-8e27-215828f1e6f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.640 182759 DEBUG oslo_concurrency.lockutils [req-9b3715c9-9c07-4bb0-b0a6-0dfbc2e8c0db req-fb8539ad-0603-4e30-8e27-215828f1e6f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.640 182759 DEBUG oslo_concurrency.lockutils [req-9b3715c9-9c07-4bb0-b0a6-0dfbc2e8c0db req-fb8539ad-0603-4e30-8e27-215828f1e6f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.640 182759 DEBUG nova.compute.manager [req-9b3715c9-9c07-4bb0-b0a6-0dfbc2e8c0db req-fb8539ad-0603-4e30-8e27-215828f1e6f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] No waiting events found dispatching network-vif-unplugged-f091e31e-112e-4a90-9947-5a807f422c9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:34 np0005591285 nova_compute[182755]: 2026-01-22 00:13:34.641 182759 DEBUG nova.compute.manager [req-9b3715c9-9c07-4bb0-b0a6-0dfbc2e8c0db req-fb8539ad-0603-4e30-8e27-215828f1e6f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-vif-unplugged-f091e31e-112e-4a90-9947-5a807f422c9c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:13:36 np0005591285 podman[230682]: 2026-01-22 00:13:36.211752039 +0000 UTC m=+0.063847823 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:13:36 np0005591285 podman[230681]: 2026-01-22 00:13:36.225731401 +0000 UTC m=+0.097227042 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.601 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.601 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.602 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.602 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.602 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.615 182759 INFO nova.compute.manager [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Terminating instance#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.628 182759 DEBUG nova.compute.manager [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:13:36 np0005591285 kernel: tapdc2fa6e9-f8 (unregistering): left promiscuous mode
Jan 21 19:13:36 np0005591285 NetworkManager[55017]: <info>  [1769040816.6503] device (tapdc2fa6e9-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:13:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:36Z|00467|binding|INFO|Releasing lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 from this chassis (sb_readonly=0)
Jan 21 19:13:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:36Z|00468|binding|INFO|Setting lport dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 down in Southbound
Jan 21 19:13:36 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:36Z|00469|binding|INFO|Removing iface tapdc2fa6e9-f8 ovn-installed in OVS
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.662 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.687 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:6f:67 10.100.0.10'], port_security=['fa:16:3e:f6:6f:67 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ee09d802-1f59-4f58-befa-a281fe642b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '873b2f2688e942d5924aa81fa18d84c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e0235df0-8e10-4f2c-bca1-3481f216c1a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6e8d061-740c-490c-81d2-02d385a2e787, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.688 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 in datapath d1e3ff28-7ba4-4007-895f-2557b60edefb unbound from our chassis#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.689 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.690 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1e3ff28-7ba4-4007-895f-2557b60edefb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.691 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a1ad9879-fe12-43d7-bb43-059a22ea8964]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.691 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb namespace which is not needed anymore#033[00m
Jan 21 19:13:36 np0005591285 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 21 19:13:36 np0005591285 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000076.scope: Consumed 4.400s CPU time.
Jan 21 19:13:36 np0005591285 systemd-machined[154022]: Machine qemu-55-instance-00000076 terminated.
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.775 182759 DEBUG nova.compute.manager [req-e3f17fe9-9bab-439b-b2aa-55aa0cb1a1ba req-4432e7f6-d16d-4a5c-9daa-549b83d02c1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.776 182759 DEBUG oslo_concurrency.lockutils [req-e3f17fe9-9bab-439b-b2aa-55aa0cb1a1ba req-4432e7f6-d16d-4a5c-9daa-549b83d02c1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "118577c2-2440-472a-b858-f075b2a804b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.776 182759 DEBUG oslo_concurrency.lockutils [req-e3f17fe9-9bab-439b-b2aa-55aa0cb1a1ba req-4432e7f6-d16d-4a5c-9daa-549b83d02c1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.776 182759 DEBUG oslo_concurrency.lockutils [req-e3f17fe9-9bab-439b-b2aa-55aa0cb1a1ba req-4432e7f6-d16d-4a5c-9daa-549b83d02c1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.776 182759 DEBUG nova.compute.manager [req-e3f17fe9-9bab-439b-b2aa-55aa0cb1a1ba req-4432e7f6-d16d-4a5c-9daa-549b83d02c1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] No waiting events found dispatching network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.776 182759 WARNING nova.compute.manager [req-e3f17fe9-9bab-439b-b2aa-55aa0cb1a1ba req-4432e7f6-d16d-4a5c-9daa-549b83d02c1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received unexpected event network-vif-plugged-f091e31e-112e-4a90-9947-5a807f422c9c for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:13:36 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [NOTICE]   (230585) : haproxy version is 2.8.14-c23fe91
Jan 21 19:13:36 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [NOTICE]   (230585) : path to executable is /usr/sbin/haproxy
Jan 21 19:13:36 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [WARNING]  (230585) : Exiting Master process...
Jan 21 19:13:36 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [WARNING]  (230585) : Exiting Master process...
Jan 21 19:13:36 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [ALERT]    (230585) : Current worker (230587) exited with code 143 (Terminated)
Jan 21 19:13:36 np0005591285 neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb[230581]: [WARNING]  (230585) : All workers exited. Exiting... (0)
Jan 21 19:13:36 np0005591285 systemd[1]: libpod-e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac.scope: Deactivated successfully.
Jan 21 19:13:36 np0005591285 podman[230743]: 2026-01-22 00:13:36.823215512 +0000 UTC m=+0.041760425 container died e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 19:13:36 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac-userdata-shm.mount: Deactivated successfully.
Jan 21 19:13:36 np0005591285 systemd[1]: var-lib-containers-storage-overlay-fcc1b5d5a50ea09411a81c9ba9eddf74d91e206163d9462057fc5e040e660631-merged.mount: Deactivated successfully.
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.855 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.856 182759 DEBUG nova.network.neutron [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:36 np0005591285 podman[230743]: 2026-01-22 00:13:36.857123006 +0000 UTC m=+0.075667909 container cleanup e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.861 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 systemd[1]: libpod-conmon-e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac.scope: Deactivated successfully.
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.878 182759 INFO nova.compute.manager [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Took 2.76 seconds to deallocate network for instance.#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.886 182759 INFO nova.virt.libvirt.driver [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Instance destroyed successfully.#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.886 182759 DEBUG nova.objects.instance [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lazy-loading 'resources' on Instance uuid ee09d802-1f59-4f58-befa-a281fe642b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.899 182759 DEBUG nova.virt.libvirt.vif [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:12:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-300361081',display_name='tempest-InstanceActionsTestJSON-server-300361081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-300361081',id=118,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:13:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='873b2f2688e942d5924aa81fa18d84c0',ramdisk_id='',reservation_id='r-0sonkysu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-232501859',owner_user_name='tempest-InstanceActionsTestJSON-232501859-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:13:32Z,user_data=None,user_id='9ee45ba20dd444a5a5e88aa96cc8e043',uuid=ee09d802-1f59-4f58-befa-a281fe642b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.900 182759 DEBUG nova.network.os_vif_util [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converting VIF {"id": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "address": "fa:16:3e:f6:6f:67", "network": {"id": "d1e3ff28-7ba4-4007-895f-2557b60edefb", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1124107938-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "873b2f2688e942d5924aa81fa18d84c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc2fa6e9-f8", "ovs_interfaceid": "dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.900 182759 DEBUG nova.network.os_vif_util [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.901 182759 DEBUG os_vif [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.902 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.903 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc2fa6e9-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.905 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.907 182759 INFO os_vif [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:6f:67,bridge_name='br-int',has_traffic_filtering=True,id=dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9,network=Network(d1e3ff28-7ba4-4007-895f-2557b60edefb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc2fa6e9-f8')#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.907 182759 INFO nova.virt.libvirt.driver [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Deleting instance files /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b_del#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.908 182759 INFO nova.virt.libvirt.driver [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Deletion of /var/lib/nova/instances/ee09d802-1f59-4f58-befa-a281fe642b6b_del complete#033[00m
Jan 21 19:13:36 np0005591285 podman[230781]: 2026-01-22 00:13:36.91540899 +0000 UTC m=+0.036860975 container remove e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.920 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[277b6933-ac50-4bed-9be1-d589809eaa3f]: (4, ('Thu Jan 22 12:13:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb (e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac)\ne5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac\nThu Jan 22 12:13:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb (e5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac)\ne5daa75e71e729a71521fb46c96a4e7543bed3eb5b951741a079fd86210ba3ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.921 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f9d000-27f4-4aef-b27f-77bb75bdda1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.922 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1e3ff28-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.923 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 kernel: tapd1e3ff28-70: left promiscuous mode
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.933 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.935 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[65d46e5b-d5f8-419e-8901-2706eaf9e1b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.959 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9ccc52ac-750b-46d0-b6dd-82c64113c077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.960 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b7b3c448-f1be-4efa-856a-e44f5f691d61]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 nova_compute[182755]: 2026-01-22 00:13:36.963 182759 DEBUG nova.compute.manager [req-62d6d37b-6ac6-4c84-8dae-15536b162767 req-f44e235f-0118-419e-92e3-4773d0f43b75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Received event network-vif-deleted-f091e31e-112e-4a90-9947-5a807f422c9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.975 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[363159bc-0743-4d40-83b1-e4e43b5ccdf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537087, 'reachable_time': 41839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230800, 'error': None, 'target': 'ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.976 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1e3ff28-7ba4-4007-895f-2557b60edefb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:13:36 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:36.976 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[454c5747-5d5d-431a-93c1-28d258861ba3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:36 np0005591285 systemd[1]: run-netns-ovnmeta\x2dd1e3ff28\x2d7ba4\x2d4007\x2d895f\x2d2557b60edefb.mount: Deactivated successfully.
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.005 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.005 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.019 182759 INFO nova.compute.manager [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.020 182759 DEBUG oslo.service.loopingcall [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.020 182759 DEBUG nova.compute.manager [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.020 182759 DEBUG nova.network.neutron [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.361 182759 DEBUG nova.compute.provider_tree [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.380 182759 DEBUG nova.scheduler.client.report [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.405 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.439 182759 INFO nova.scheduler.client.report [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 118577c2-2440-472a-b858-f075b2a804b1#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.480 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.539 182759 DEBUG oslo_concurrency.lockutils [None req-02473b20-3767-47f6-8ef7-cf831b481552 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "118577c2-2440-472a-b858-f075b2a804b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.706 182759 DEBUG nova.network.neutron [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.722 182759 INFO nova.compute.manager [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Took 0.70 seconds to deallocate network for instance.#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.822 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.823 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.885 182759 DEBUG nova.compute.provider_tree [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.900 182759 DEBUG nova.scheduler.client.report [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.941 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:37 np0005591285 nova_compute[182755]: 2026-01-22 00:13:37.972 182759 INFO nova.scheduler.client.report [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Deleted allocations for instance ee09d802-1f59-4f58-befa-a281fe642b6b#033[00m
Jan 21 19:13:38 np0005591285 nova_compute[182755]: 2026-01-22 00:13:38.069 182759 DEBUG oslo_concurrency.lockutils [None req-36d9e20d-0d42-4a91-b630-3f4346468b15 9ee45ba20dd444a5a5e88aa96cc8e043 873b2f2688e942d5924aa81fa18d84c0 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:38 np0005591285 podman[230801]: 2026-01-22 00:13:38.215787499 +0000 UTC m=+0.084050061 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:13:39 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:39.084 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.978 182759 DEBUG nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-unplugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.978 182759 DEBUG oslo_concurrency.lockutils [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.978 182759 DEBUG oslo_concurrency.lockutils [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.979 182759 DEBUG oslo_concurrency.lockutils [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.979 182759 DEBUG nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-unplugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.979 182759 WARNING nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-unplugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.979 182759 DEBUG nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.980 182759 DEBUG oslo_concurrency.lockutils [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.980 182759 DEBUG oslo_concurrency.lockutils [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.980 182759 DEBUG oslo_concurrency.lockutils [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ee09d802-1f59-4f58-befa-a281fe642b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.980 182759 DEBUG nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] No waiting events found dispatching network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.981 182759 WARNING nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received unexpected event network-vif-plugged-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:13:39 np0005591285 nova_compute[182755]: 2026-01-22 00:13:39.981 182759 DEBUG nova.compute.manager [req-ffaa772f-fa7c-454e-82b4-6400e7f8e643 req-46e6ae4d-8bdd-42e7-aa64-9e8c1df7b56a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Received event network-vif-deleted-dc2fa6e9-f8da-41bd-b113-8c7f2b22d4a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:41 np0005591285 nova_compute[182755]: 2026-01-22 00:13:41.266 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:41 np0005591285 nova_compute[182755]: 2026-01-22 00:13:41.267 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:13:41 np0005591285 nova_compute[182755]: 2026-01-22 00:13:41.906 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:42 np0005591285 nova_compute[182755]: 2026-01-22 00:13:42.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:42 np0005591285 nova_compute[182755]: 2026-01-22 00:13:42.482 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:42 np0005591285 nova_compute[182755]: 2026-01-22 00:13:42.873 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "e70b4551-5394-4f61-b02d-ad3b69890e83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:42 np0005591285 nova_compute[182755]: 2026-01-22 00:13:42.874 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:42 np0005591285 nova_compute[182755]: 2026-01-22 00:13:42.907 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.084 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.085 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.092 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.092 182759 INFO nova.compute.claims [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.288 182759 DEBUG nova.compute.provider_tree [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.323 182759 DEBUG nova.scheduler.client.report [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.356 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.356 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.434 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.435 182759 DEBUG nova.network.neutron [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.474 182759 INFO nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.499 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.641 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.643 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.643 182759 INFO nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Creating image(s)#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.644 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "/var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.644 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "/var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.645 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "/var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.662 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.694 182759 DEBUG nova.policy [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '685d867cee4e4629a9cebd15bdbcb282', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3072623f3b79497ab043c7aafb5a7523', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.720 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.721 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.722 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.734 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.793 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.794 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.833 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.835 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.835 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.924 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.926 182759 DEBUG nova.virt.disk.api [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Checking if we can resize image /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:13:43 np0005591285 nova_compute[182755]: 2026-01-22 00:13:43.928 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.020 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.023 182759 DEBUG nova.virt.disk.api [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Cannot resize image /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.024 182759 DEBUG nova.objects.instance [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lazy-loading 'migration_context' on Instance uuid e70b4551-5394-4f61-b02d-ad3b69890e83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.042 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.052 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.053 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Ensure instance console log exists: /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.054 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.054 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.054 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.220 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:44 np0005591285 nova_compute[182755]: 2026-01-22 00:13:44.412 182759 DEBUG nova.network.neutron [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Successfully created port: 5d8d346d-4613-42c3-a377-72d51f4de448 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.554 182759 DEBUG nova.network.neutron [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Successfully updated port: 5d8d346d-4613-42c3-a377-72d51f4de448 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.593 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "refresh_cache-e70b4551-5394-4f61-b02d-ad3b69890e83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.593 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquired lock "refresh_cache-e70b4551-5394-4f61-b02d-ad3b69890e83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.594 182759 DEBUG nova.network.neutron [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.690 182759 DEBUG nova.compute.manager [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Received event network-changed-5d8d346d-4613-42c3-a377-72d51f4de448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.691 182759 DEBUG nova.compute.manager [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Refreshing instance network info cache due to event network-changed-5d8d346d-4613-42c3-a377-72d51f4de448. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.691 182759 DEBUG oslo_concurrency.lockutils [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e70b4551-5394-4f61-b02d-ad3b69890e83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:13:45 np0005591285 nova_compute[182755]: 2026-01-22 00:13:45.825 182759 DEBUG nova.network.neutron [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:13:46 np0005591285 nova_compute[182755]: 2026-01-22 00:13:46.922 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.306 182759 DEBUG nova.network.neutron [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Updating instance_info_cache with network_info: [{"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.329 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Releasing lock "refresh_cache-e70b4551-5394-4f61-b02d-ad3b69890e83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.329 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Instance network_info: |[{"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.330 182759 DEBUG oslo_concurrency.lockutils [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e70b4551-5394-4f61-b02d-ad3b69890e83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.330 182759 DEBUG nova.network.neutron [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Refreshing network info cache for port 5d8d346d-4613-42c3-a377-72d51f4de448 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.332 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Start _get_guest_xml network_info=[{"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.337 182759 WARNING nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.345 182759 DEBUG nova.virt.libvirt.host [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.346 182759 DEBUG nova.virt.libvirt.host [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.353 182759 DEBUG nova.virt.libvirt.host [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.354 182759 DEBUG nova.virt.libvirt.host [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.355 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.355 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.356 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.356 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.356 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.356 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.356 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.357 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.357 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.357 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.357 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.357 182759 DEBUG nova.virt.hardware [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.361 182759 DEBUG nova.virt.libvirt.vif [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1340270100',display_name='tempest-ServerMetadataNegativeTestJSON-server-1340270100',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1340270100',id=122,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3072623f3b79497ab043c7aafb5a7523',ramdisk_id='',reservation_id='r-qs2dhcsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1469736869',owner_u
ser_name='tempest-ServerMetadataNegativeTestJSON-1469736869-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:13:43Z,user_data=None,user_id='685d867cee4e4629a9cebd15bdbcb282',uuid=e70b4551-5394-4f61-b02d-ad3b69890e83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.361 182759 DEBUG nova.network.os_vif_util [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Converting VIF {"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.361 182759 DEBUG nova.network.os_vif_util [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.362 182759 DEBUG nova.objects.instance [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lazy-loading 'pci_devices' on Instance uuid e70b4551-5394-4f61-b02d-ad3b69890e83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.379 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <uuid>e70b4551-5394-4f61-b02d-ad3b69890e83</uuid>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <name>instance-0000007a</name>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-1340270100</nova:name>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:13:47</nova:creationTime>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:user uuid="685d867cee4e4629a9cebd15bdbcb282">tempest-ServerMetadataNegativeTestJSON-1469736869-project-member</nova:user>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:project uuid="3072623f3b79497ab043c7aafb5a7523">tempest-ServerMetadataNegativeTestJSON-1469736869</nova:project>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        <nova:port uuid="5d8d346d-4613-42c3-a377-72d51f4de448">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <entry name="serial">e70b4551-5394-4f61-b02d-ad3b69890e83</entry>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <entry name="uuid">e70b4551-5394-4f61-b02d-ad3b69890e83</entry>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.config"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:2d:1c:67"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <target dev="tap5d8d346d-46"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/console.log" append="off"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:13:47 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:13:47 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:13:47 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:13:47 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.380 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Preparing to wait for external event network-vif-plugged-5d8d346d-4613-42c3-a377-72d51f4de448 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.380 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.380 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.381 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.381 182759 DEBUG nova.virt.libvirt.vif [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1340270100',display_name='tempest-ServerMetadataNegativeTestJSON-server-1340270100',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1340270100',id=122,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3072623f3b79497ab043c7aafb5a7523',ramdisk_id='',reservation_id='r-qs2dhcsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1469736869',owner_user_name='tempest-ServerMetadataNegativeTestJSON-1469736869-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:13:43Z,user_data=None,user_id='685d867cee4e4629a9cebd15bdbcb282',uuid=e70b4551-5394-4f61-b02d-ad3b69890e83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.382 182759 DEBUG nova.network.os_vif_util [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Converting VIF {"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.382 182759 DEBUG nova.network.os_vif_util [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.383 182759 DEBUG os_vif [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.383 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.383 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.384 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.387 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.387 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d8d346d-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.388 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d8d346d-46, col_values=(('external_ids', {'iface-id': '5d8d346d-4613-42c3-a377-72d51f4de448', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:1c:67', 'vm-uuid': 'e70b4551-5394-4f61-b02d-ad3b69890e83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:47 np0005591285 NetworkManager[55017]: <info>  [1769040827.4368] manager: (tap5d8d346d-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.436 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.441 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.442 182759 INFO os_vif [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46')#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.484 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.514 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.515 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.515 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] No VIF found with MAC fa:16:3e:2d:1c:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.516 182759 INFO nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Using config drive#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.980 182759 INFO nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Creating config drive at /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.config#033[00m
Jan 21 19:13:47 np0005591285 nova_compute[182755]: 2026-01-22 00:13:47.987 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw7rpjlsh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.113 182759 DEBUG oslo_concurrency.processutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw7rpjlsh" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.173 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040813.1720095, 118577c2-2440-472a-b858-f075b2a804b1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.174 182759 INFO nova.compute.manager [-] [instance: 118577c2-2440-472a-b858-f075b2a804b1] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:13:48 np0005591285 kernel: tap5d8d346d-46: entered promiscuous mode
Jan 21 19:13:48 np0005591285 NetworkManager[55017]: <info>  [1769040828.1829] manager: (tap5d8d346d-46): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Jan 21 19:13:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:48Z|00470|binding|INFO|Claiming lport 5d8d346d-4613-42c3-a377-72d51f4de448 for this chassis.
Jan 21 19:13:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:48Z|00471|binding|INFO|5d8d346d-4613-42c3-a377-72d51f4de448: Claiming fa:16:3e:2d:1c:67 10.100.0.5
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.183 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.188 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.195 182759 DEBUG nova.compute.manager [None req-67f4b82f-bd72-43ca-b2f0-70cca4c8ee4e - - - - - -] [instance: 118577c2-2440-472a-b858-f075b2a804b1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.197 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:1c:67 10.100.0.5'], port_security=['fa:16:3e:2d:1c:67 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e70b4551-5394-4f61-b02d-ad3b69890e83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-742490ba-4d29-4314-8cf8-ff183d293525', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3072623f3b79497ab043c7aafb5a7523', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a39c3f33-4773-428c-9ed7-8f603a221d83', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb2947d7-9c21-451d-9b8e-7a292419011d, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=5d8d346d-4613-42c3-a377-72d51f4de448) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.198 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 5d8d346d-4613-42c3-a377-72d51f4de448 in datapath 742490ba-4d29-4314-8cf8-ff183d293525 bound to our chassis#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.199 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 742490ba-4d29-4314-8cf8-ff183d293525#033[00m
Jan 21 19:13:48 np0005591285 systemd-udevd[230862]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.211 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[617a79c7-d140-4ed4-a321-400cf2de70a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.212 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap742490ba-41 in ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.214 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap742490ba-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.214 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6045ec-4e31-447b-b0da-2ffed1a672e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.215 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0e38a1-ed54-4d2b-9104-efc73452d504]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:13:48 np0005591285 NetworkManager[55017]: <info>  [1769040828.2257] device (tap5d8d346d-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:13:48 np0005591285 NetworkManager[55017]: <info>  [1769040828.2263] device (tap5d8d346d-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.225 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[14019586-9bfc-44ee-b126-a0863df683f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.239 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:48 np0005591285 systemd-machined[154022]: New machine qemu-56-instance-0000007a.
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.268 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba16d1a0-272a-42e3-af57-5a3d09a13f77]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.275 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:48 np0005591285 systemd[1]: Started Virtual Machine qemu-56-instance-0000007a.
Jan 21 19:13:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:48Z|00472|binding|INFO|Setting lport 5d8d346d-4613-42c3-a377-72d51f4de448 ovn-installed in OVS
Jan 21 19:13:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:48Z|00473|binding|INFO|Setting lport 5d8d346d-4613-42c3-a377-72d51f4de448 up in Southbound
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.281 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.302 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bd178e92-b303-4d21-8dce-6ff377166934]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.307 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2ae8f3d0-d271-4858-8240-3d801f543db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 NetworkManager[55017]: <info>  [1769040828.3081] manager: (tap742490ba-40): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Jan 21 19:13:48 np0005591285 systemd-udevd[230867]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.338 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[979135e9-c12d-4799-8878-b1dc3813be40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.341 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e81cd2-ec63-4c17-9ea4-ce409cae3480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 NetworkManager[55017]: <info>  [1769040828.3605] device (tap742490ba-40): carrier: link connected
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.365 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[85b0521f-0e24-4155-8afe-0c1b8654a6fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.380 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[db03989f-a1f7-4d94-a3f3-efb4135878d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap742490ba-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:cf:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538701, 'reachable_time': 37248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230897, 'error': None, 'target': 'ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.392 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a18bbe7-a2c4-4368-8dfa-ea43ef68eed0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feca:cf7a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538701, 'tstamp': 538701}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230898, 'error': None, 'target': 'ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.406 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[89a666df-cdda-4740-9851-9f3ff2dabde5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap742490ba-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:cf:7a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538701, 'reachable_time': 37248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230899, 'error': None, 'target': 'ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.432 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce94a389-812e-411c-a616-b159b1e51355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.495 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9c4a7c-3b7c-4b50-b520-3333eb214b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.497 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap742490ba-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.497 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.497 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap742490ba-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.514 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:48 np0005591285 kernel: tap742490ba-40: entered promiscuous mode
Jan 21 19:13:48 np0005591285 NetworkManager[55017]: <info>  [1769040828.5164] manager: (tap742490ba-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.517 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap742490ba-40, col_values=(('external_ids', {'iface-id': 'adcda054-a10b-4a08-9e7c-a48cb4d3da53'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:48Z|00474|binding|INFO|Releasing lport adcda054-a10b-4a08-9e7c-a48cb4d3da53 from this chassis (sb_readonly=0)
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.520 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/742490ba-4d29-4314-8cf8-ff183d293525.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/742490ba-4d29-4314-8cf8-ff183d293525.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.521 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[889a2c68-99f3-45e0-a089-745345ed93ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.522 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-742490ba-4d29-4314-8cf8-ff183d293525
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/742490ba-4d29-4314-8cf8-ff183d293525.pid.haproxy
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 742490ba-4d29-4314-8cf8-ff183d293525
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:13:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:48.525 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525', 'env', 'PROCESS_TAG=haproxy-742490ba-4d29-4314-8cf8-ff183d293525', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/742490ba-4d29-4314-8cf8-ff183d293525.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.531 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.624 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040828.6238966, e70b4551-5394-4f61-b02d-ad3b69890e83 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.625 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] VM Started (Lifecycle Event)#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.649 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.652 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040828.624046, e70b4551-5394-4f61-b02d-ad3b69890e83 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.652 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.669 182759 DEBUG nova.compute.manager [req-53fe1fb0-aba7-4573-a0c8-d632029c0fc3 req-3d7d9d69-8774-4d1e-b3e1-7c23eadb06b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Received event network-vif-plugged-5d8d346d-4613-42c3-a377-72d51f4de448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.670 182759 DEBUG oslo_concurrency.lockutils [req-53fe1fb0-aba7-4573-a0c8-d632029c0fc3 req-3d7d9d69-8774-4d1e-b3e1-7c23eadb06b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.671 182759 DEBUG oslo_concurrency.lockutils [req-53fe1fb0-aba7-4573-a0c8-d632029c0fc3 req-3d7d9d69-8774-4d1e-b3e1-7c23eadb06b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.671 182759 DEBUG oslo_concurrency.lockutils [req-53fe1fb0-aba7-4573-a0c8-d632029c0fc3 req-3d7d9d69-8774-4d1e-b3e1-7c23eadb06b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.672 182759 DEBUG nova.compute.manager [req-53fe1fb0-aba7-4573-a0c8-d632029c0fc3 req-3d7d9d69-8774-4d1e-b3e1-7c23eadb06b8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Processing event network-vif-plugged-5d8d346d-4613-42c3-a377-72d51f4de448 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.674 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.675 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.679 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.681 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040828.6787922, e70b4551-5394-4f61-b02d-ad3b69890e83 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.682 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.690 182759 INFO nova.virt.libvirt.driver [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Instance spawned successfully.#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.691 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.711 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.715 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.716 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.716 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.717 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.717 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.717 182759 DEBUG nova.virt.libvirt.driver [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.721 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.758 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.787 182759 INFO nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Took 5.15 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.787 182759 DEBUG nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.869 182759 INFO nova.compute.manager [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Took 5.85 seconds to build instance.#033[00m
Jan 21 19:13:48 np0005591285 podman[230938]: 2026-01-22 00:13:48.876534647 +0000 UTC m=+0.049467189 container create 3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:13:48 np0005591285 nova_compute[182755]: 2026-01-22 00:13:48.896 182759 DEBUG oslo_concurrency.lockutils [None req-8df65266-e942-4836-8336-ecf2622f1a3c 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:48 np0005591285 systemd[1]: Started libpod-conmon-3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703.scope.
Jan 21 19:13:48 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:13:48 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81d78b042a2547cbfde9f76306191ba1f3ba2c31683b8c54e96b467cef53e8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:13:48 np0005591285 podman[230938]: 2026-01-22 00:13:48.942470696 +0000 UTC m=+0.115403258 container init 3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:13:48 np0005591285 podman[230938]: 2026-01-22 00:13:48.850827212 +0000 UTC m=+0.023759804 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:13:48 np0005591285 podman[230938]: 2026-01-22 00:13:48.947559592 +0000 UTC m=+0.120492134 container start 3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:13:48 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [NOTICE]   (230957) : New worker (230959) forked
Jan 21 19:13:48 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [NOTICE]   (230957) : Loading success.
Jan 21 19:13:49 np0005591285 nova_compute[182755]: 2026-01-22 00:13:49.680 182759 DEBUG nova.network.neutron [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Updated VIF entry in instance network info cache for port 5d8d346d-4613-42c3-a377-72d51f4de448. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:13:49 np0005591285 nova_compute[182755]: 2026-01-22 00:13:49.682 182759 DEBUG nova.network.neutron [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Updating instance_info_cache with network_info: [{"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:49 np0005591285 nova_compute[182755]: 2026-01-22 00:13:49.707 182759 DEBUG oslo_concurrency.lockutils [req-ee55501d-280d-46eb-bbc6-9aecaaf90d7a req-636bc9f7-171b-4984-8bfd-9718945a3b32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e70b4551-5394-4f61-b02d-ad3b69890e83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.253 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.289 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.290 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.290 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.291 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.360 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.434 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.435 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.496 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.629 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.630 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5466MB free_disk=73.19244384765625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.631 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.631 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.713 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance e70b4551-5394-4f61-b02d-ad3b69890e83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.714 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.714 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.762 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:13:50 np0005591285 nova_compute[182755]: 2026-01-22 00:13:50.794 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.009 182759 DEBUG nova.compute.manager [req-c4caa2f6-f9d7-43ac-af3f-b16c622e4595 req-d80ce594-5b93-4c33-b4b7-e051fd47b72f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Received event network-vif-plugged-5d8d346d-4613-42c3-a377-72d51f4de448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.010 182759 DEBUG oslo_concurrency.lockutils [req-c4caa2f6-f9d7-43ac-af3f-b16c622e4595 req-d80ce594-5b93-4c33-b4b7-e051fd47b72f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.011 182759 DEBUG oslo_concurrency.lockutils [req-c4caa2f6-f9d7-43ac-af3f-b16c622e4595 req-d80ce594-5b93-4c33-b4b7-e051fd47b72f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.011 182759 DEBUG oslo_concurrency.lockutils [req-c4caa2f6-f9d7-43ac-af3f-b16c622e4595 req-d80ce594-5b93-4c33-b4b7-e051fd47b72f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.012 182759 DEBUG nova.compute.manager [req-c4caa2f6-f9d7-43ac-af3f-b16c622e4595 req-d80ce594-5b93-4c33-b4b7-e051fd47b72f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] No waiting events found dispatching network-vif-plugged-5d8d346d-4613-42c3-a377-72d51f4de448 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.012 182759 WARNING nova.compute.manager [req-c4caa2f6-f9d7-43ac-af3f-b16c622e4595 req-d80ce594-5b93-4c33-b4b7-e051fd47b72f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Received unexpected event network-vif-plugged-5d8d346d-4613-42c3-a377-72d51f4de448 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.246 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.884 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040816.8823483, ee09d802-1f59-4f58-befa-a281fe642b6b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.884 182759 INFO nova.compute.manager [-] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:13:51 np0005591285 nova_compute[182755]: 2026-01-22 00:13:51.909 182759 DEBUG nova.compute.manager [None req-3baf83dc-7101-4012-9028-e0c0da1bb383 - - - - - -] [instance: ee09d802-1f59-4f58-befa-a281fe642b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:13:52 np0005591285 nova_compute[182755]: 2026-01-22 00:13:52.438 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:52 np0005591285 nova_compute[182755]: 2026-01-22 00:13:52.487 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.479 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "e70b4551-5394-4f61-b02d-ad3b69890e83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.480 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.481 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.481 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.481 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.494 182759 INFO nova.compute.manager [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Terminating instance#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.506 182759 DEBUG nova.compute.manager [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:13:53 np0005591285 kernel: tap5d8d346d-46 (unregistering): left promiscuous mode
Jan 21 19:13:53 np0005591285 NetworkManager[55017]: <info>  [1769040833.5355] device (tap5d8d346d-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.542 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:53Z|00475|binding|INFO|Releasing lport 5d8d346d-4613-42c3-a377-72d51f4de448 from this chassis (sb_readonly=0)
Jan 21 19:13:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:53Z|00476|binding|INFO|Setting lport 5d8d346d-4613-42c3-a377-72d51f4de448 down in Southbound
Jan 21 19:13:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:13:53Z|00477|binding|INFO|Removing iface tap5d8d346d-46 ovn-installed in OVS
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.557 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Jan 21 19:13:53 np0005591285 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000007a.scope: Consumed 5.020s CPU time.
Jan 21 19:13:53 np0005591285 systemd-machined[154022]: Machine qemu-56-instance-0000007a terminated.
Jan 21 19:13:53 np0005591285 podman[230979]: 2026-01-22 00:13:53.63443511 +0000 UTC m=+0.060289889 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:13:53 np0005591285 podman[230977]: 2026-01-22 00:13:53.644844206 +0000 UTC m=+0.072105062 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.716 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:1c:67 10.100.0.5'], port_security=['fa:16:3e:2d:1c:67 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e70b4551-5394-4f61-b02d-ad3b69890e83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-742490ba-4d29-4314-8cf8-ff183d293525', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3072623f3b79497ab043c7aafb5a7523', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a39c3f33-4773-428c-9ed7-8f603a221d83', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb2947d7-9c21-451d-9b8e-7a292419011d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=5d8d346d-4613-42c3-a377-72d51f4de448) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.718 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 5d8d346d-4613-42c3-a377-72d51f4de448 in datapath 742490ba-4d29-4314-8cf8-ff183d293525 unbound from our chassis#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.719 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 742490ba-4d29-4314-8cf8-ff183d293525, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.720 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b20588db-4ca6-4ae4-b628-324781263974]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.721 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525 namespace which is not needed anymore#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.733 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.770 182759 INFO nova.virt.libvirt.driver [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Instance destroyed successfully.#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.770 182759 DEBUG nova.objects.instance [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lazy-loading 'resources' on Instance uuid e70b4551-5394-4f61-b02d-ad3b69890e83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.796 182759 DEBUG nova.virt.libvirt.vif [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:13:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-1340270100',display_name='tempest-ServerMetadataNegativeTestJSON-server-1340270100',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-1340270100',id=122,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:13:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3072623f3b79497ab043c7aafb5a7523',ramdisk_id='',reservation_id='r-qs2dhcsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-1469736869',owner_user_name='tempest-ServerMetadataNegativeTestJSON-1469736869-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:13:48Z,user_data=None,user_id='685d867cee4e4629a9cebd15bdbcb282',uuid=e70b4551-5394-4f61-b02d-ad3b69890e83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.796 182759 DEBUG nova.network.os_vif_util [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Converting VIF {"id": "5d8d346d-4613-42c3-a377-72d51f4de448", "address": "fa:16:3e:2d:1c:67", "network": {"id": "742490ba-4d29-4314-8cf8-ff183d293525", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-46026287-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3072623f3b79497ab043c7aafb5a7523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d8d346d-46", "ovs_interfaceid": "5d8d346d-4613-42c3-a377-72d51f4de448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.797 182759 DEBUG nova.network.os_vif_util [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.797 182759 DEBUG os_vif [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.800 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.800 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d8d346d-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.802 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.803 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.805 182759 INFO os_vif [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:1c:67,bridge_name='br-int',has_traffic_filtering=True,id=5d8d346d-4613-42c3-a377-72d51f4de448,network=Network(742490ba-4d29-4314-8cf8-ff183d293525),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d8d346d-46')#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.806 182759 INFO nova.virt.libvirt.driver [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Deleting instance files /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83_del#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.807 182759 INFO nova.virt.libvirt.driver [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Deletion of /var/lib/nova/instances/e70b4551-5394-4f61-b02d-ad3b69890e83_del complete#033[00m
Jan 21 19:13:53 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [NOTICE]   (230957) : haproxy version is 2.8.14-c23fe91
Jan 21 19:13:53 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [NOTICE]   (230957) : path to executable is /usr/sbin/haproxy
Jan 21 19:13:53 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [WARNING]  (230957) : Exiting Master process...
Jan 21 19:13:53 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [ALERT]    (230957) : Current worker (230959) exited with code 143 (Terminated)
Jan 21 19:13:53 np0005591285 neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525[230953]: [WARNING]  (230957) : All workers exited. Exiting... (0)
Jan 21 19:13:53 np0005591285 systemd[1]: libpod-3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703.scope: Deactivated successfully.
Jan 21 19:13:53 np0005591285 podman[231056]: 2026-01-22 00:13:53.865308364 +0000 UTC m=+0.049299954 container died 3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 19:13:53 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703-userdata-shm.mount: Deactivated successfully.
Jan 21 19:13:53 np0005591285 systemd[1]: var-lib-containers-storage-overlay-b81d78b042a2547cbfde9f76306191ba1f3ba2c31683b8c54e96b467cef53e8d-merged.mount: Deactivated successfully.
Jan 21 19:13:53 np0005591285 podman[231056]: 2026-01-22 00:13:53.899894746 +0000 UTC m=+0.083886336 container cleanup 3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:13:53 np0005591285 systemd[1]: libpod-conmon-3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703.scope: Deactivated successfully.
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.917 182759 INFO nova.compute.manager [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.918 182759 DEBUG oslo.service.loopingcall [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.918 182759 DEBUG nova.compute.manager [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.919 182759 DEBUG nova.network.neutron [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:13:53 np0005591285 podman[231086]: 2026-01-22 00:13:53.967300424 +0000 UTC m=+0.040940063 container remove 3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.971 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[855c93bd-56ab-4701-a71a-4196a13c5a60]: (4, ('Thu Jan 22 12:13:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525 (3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703)\n3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703\nThu Jan 22 12:13:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525 (3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703)\n3df1cf01549538b3a4f9112e89b74ec3feb819ffc4e015da5dc9d6017671a703\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.974 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6a8efb-7428-4d15-962d-8dd0a6fd7b69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.975 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap742490ba-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.976 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 kernel: tap742490ba-40: left promiscuous mode
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.981 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[38a516d7-b56a-49fc-aa98-830b7e848ec9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:53 np0005591285 nova_compute[182755]: 2026-01-22 00:13:53.990 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.996 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[62be832b-6dd7-4cac-8d36-906e7e189f0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:53.997 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[da99e0dc-e33b-4b95-8bf2-b087d0877869]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:54.016 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b60e13f9-b1fa-4cab-9159-009462cbba2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538695, 'reachable_time': 30514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231101, 'error': None, 'target': 'ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:54.019 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-742490ba-4d29-4314-8cf8-ff183d293525 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:13:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:13:54.019 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[4354ef56-3057-489a-bf99-fd0bf8415ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:13:54 np0005591285 systemd[1]: run-netns-ovnmeta\x2d742490ba\x2d4d29\x2d4314\x2d8cf8\x2dff183d293525.mount: Deactivated successfully.
Jan 21 19:13:54 np0005591285 nova_compute[182755]: 2026-01-22 00:13:54.211 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:13:55 np0005591285 nova_compute[182755]: 2026-01-22 00:13:55.874 182759 DEBUG nova.network.neutron [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:56 np0005591285 nova_compute[182755]: 2026-01-22 00:13:56.069 182759 DEBUG nova.compute.manager [req-e2d1a6d3-cc3f-40e1-ac29-e002a5347d71 req-61efc83e-300c-4191-a733-4e550584540b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Received event network-vif-deleted-5d8d346d-4613-42c3-a377-72d51f4de448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:13:56 np0005591285 nova_compute[182755]: 2026-01-22 00:13:56.069 182759 INFO nova.compute.manager [req-e2d1a6d3-cc3f-40e1-ac29-e002a5347d71 req-61efc83e-300c-4191-a733-4e550584540b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Neutron deleted interface 5d8d346d-4613-42c3-a377-72d51f4de448; detaching it from the instance and deleting it from the info cache#033[00m
Jan 21 19:13:56 np0005591285 nova_compute[182755]: 2026-01-22 00:13:56.070 182759 DEBUG nova.network.neutron [req-e2d1a6d3-cc3f-40e1-ac29-e002a5347d71 req-61efc83e-300c-4191-a733-4e550584540b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:13:56 np0005591285 nova_compute[182755]: 2026-01-22 00:13:56.415 182759 INFO nova.compute.manager [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Took 2.50 seconds to deallocate network for instance.#033[00m
Jan 21 19:13:56 np0005591285 nova_compute[182755]: 2026-01-22 00:13:56.419 182759 DEBUG nova.compute.manager [req-e2d1a6d3-cc3f-40e1-ac29-e002a5347d71 req-61efc83e-300c-4191-a733-4e550584540b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Detach interface failed, port_id=5d8d346d-4613-42c3-a377-72d51f4de448, reason: Instance e70b4551-5394-4f61-b02d-ad3b69890e83 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.010 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.011 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.120 182759 DEBUG nova.compute.provider_tree [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.464 182759 DEBUG nova.scheduler.client.report [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.488 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.511 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.602 182759 INFO nova.scheduler.client.report [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Deleted allocations for instance e70b4551-5394-4f61-b02d-ad3b69890e83#033[00m
Jan 21 19:13:57 np0005591285 nova_compute[182755]: 2026-01-22 00:13:57.711 182759 DEBUG oslo_concurrency.lockutils [None req-5722c82b-d4ae-41ac-8d69-b5f74441a59b 685d867cee4e4629a9cebd15bdbcb282 3072623f3b79497ab043c7aafb5a7523 - - default default] Lock "e70b4551-5394-4f61-b02d-ad3b69890e83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:13:58 np0005591285 nova_compute[182755]: 2026-01-22 00:13:58.804 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:02 np0005591285 nova_compute[182755]: 2026-01-22 00:14:02.491 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:02.977 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:02.978 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:02.978 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:03 np0005591285 podman[231102]: 2026-01-22 00:14:03.220978669 +0000 UTC m=+0.078346390 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:14:03 np0005591285 nova_compute[182755]: 2026-01-22 00:14:03.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:07 np0005591285 podman[231125]: 2026-01-22 00:14:07.207369741 +0000 UTC m=+0.071011105 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 19:14:07 np0005591285 podman[231126]: 2026-01-22 00:14:07.209788016 +0000 UTC m=+0.072149655 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:14:07 np0005591285 nova_compute[182755]: 2026-01-22 00:14:07.493 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:08 np0005591285 nova_compute[182755]: 2026-01-22 00:14:08.767 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040833.7661521, e70b4551-5394-4f61-b02d-ad3b69890e83 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:14:08 np0005591285 nova_compute[182755]: 2026-01-22 00:14:08.768 182759 INFO nova.compute.manager [-] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:14:08 np0005591285 nova_compute[182755]: 2026-01-22 00:14:08.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:09 np0005591285 podman[231169]: 2026-01-22 00:14:09.219904838 +0000 UTC m=+0.094602093 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:14:09 np0005591285 nova_compute[182755]: 2026-01-22 00:14:09.463 182759 DEBUG nova.compute.manager [None req-cd63a1cd-ae00-4920-a421-851a41fc427f - - - - - -] [instance: e70b4551-5394-4f61-b02d-ad3b69890e83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:14:12 np0005591285 nova_compute[182755]: 2026-01-22 00:14:12.495 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:13 np0005591285 nova_compute[182755]: 2026-01-22 00:14:13.823 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.649 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.650 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.686 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.862 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.863 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.872 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:14:14 np0005591285 nova_compute[182755]: 2026-01-22 00:14:14.872 182759 INFO nova.compute.claims [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.060 182759 DEBUG nova.compute.provider_tree [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.082 182759 DEBUG nova.scheduler.client.report [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.125 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.126 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.241 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.241 182759 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.292 182759 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.336 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.599 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.601 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.601 182759 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Creating image(s)#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.602 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "/var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.602 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.602 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.615 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.669 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.670 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.671 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.682 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.734 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.735 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.768 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.769 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.770 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.823 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.824 182759 DEBUG nova.virt.disk.api [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Checking if we can resize image /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.825 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.878 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.880 182759 DEBUG nova.virt.disk.api [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Cannot resize image /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.881 182759 DEBUG nova.objects.instance [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'migration_context' on Instance uuid 4a27cb55-31e5-4343-bed7-44671f47ae20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:14:15 np0005591285 nova_compute[182755]: 2026-01-22 00:14:15.891 182759 DEBUG nova.policy [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00a7d470e36045deabd5584bd3a9c73e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.499 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.822 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.823 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Ensure instance console log exists: /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.823 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.824 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.824 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:17 np0005591285 nova_compute[182755]: 2026-01-22 00:14:17.966 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:18 np0005591285 nova_compute[182755]: 2026-01-22 00:14:18.828 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:21.915 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:14:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:21.916 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:14:21 np0005591285 nova_compute[182755]: 2026-01-22 00:14:21.957 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:22 np0005591285 nova_compute[182755]: 2026-01-22 00:14:22.501 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:22 np0005591285 nova_compute[182755]: 2026-01-22 00:14:22.598 182759 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Successfully created port: bd929f81-9ed0-4db8-9e56-d5c7f3261b6a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.166 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.168 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:14:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:14:23 np0005591285 nova_compute[182755]: 2026-01-22 00:14:23.831 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:24 np0005591285 podman[231211]: 2026-01-22 00:14:24.21179109 +0000 UTC m=+0.076040898 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 21 19:14:24 np0005591285 podman[231210]: 2026-01-22 00:14:24.236253493 +0000 UTC m=+0.100721647 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git)
Jan 21 19:14:24 np0005591285 nova_compute[182755]: 2026-01-22 00:14:24.782 182759 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Successfully updated port: bd929f81-9ed0-4db8-9e56-d5c7f3261b6a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:14:24 np0005591285 nova_compute[182755]: 2026-01-22 00:14:24.979 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "refresh_cache-4a27cb55-31e5-4343-bed7-44671f47ae20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:14:24 np0005591285 nova_compute[182755]: 2026-01-22 00:14:24.979 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquired lock "refresh_cache-4a27cb55-31e5-4343-bed7-44671f47ae20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:14:24 np0005591285 nova_compute[182755]: 2026-01-22 00:14:24.980 182759 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:14:25 np0005591285 nova_compute[182755]: 2026-01-22 00:14:25.458 182759 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:14:25 np0005591285 nova_compute[182755]: 2026-01-22 00:14:25.616 182759 DEBUG nova.compute.manager [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-changed-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:14:25 np0005591285 nova_compute[182755]: 2026-01-22 00:14:25.617 182759 DEBUG nova.compute.manager [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Refreshing instance network info cache due to event network-changed-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:14:25 np0005591285 nova_compute[182755]: 2026-01-22 00:14:25.617 182759 DEBUG oslo_concurrency.lockutils [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-4a27cb55-31e5-4343-bed7-44671f47ae20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:14:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:25.918 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.433 182759 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Updating instance_info_cache with network_info: [{"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.501 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Releasing lock "refresh_cache-4a27cb55-31e5-4343-bed7-44671f47ae20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.502 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Instance network_info: |[{"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.502 182759 DEBUG oslo_concurrency.lockutils [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-4a27cb55-31e5-4343-bed7-44671f47ae20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.502 182759 DEBUG nova.network.neutron [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Refreshing network info cache for port bd929f81-9ed0-4db8-9e56-d5c7f3261b6a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.505 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Start _get_guest_xml network_info=[{"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.513 182759 WARNING nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.532 182759 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.533 182759 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.541 182759 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.542 182759 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.544 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.544 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.545 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.545 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.546 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.546 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.547 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.547 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.548 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.548 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.549 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.549 182759 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.555 182759 DEBUG nova.virt.libvirt.vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1102130362',display_name='tempest-tempest.common.compute-instance-1102130362-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1102130362-2',id=124,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-u2tu2ftn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:14:15Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=4a27cb55-31e5-4343-bed7-44671f47ae20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.556 182759 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.557 182759 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.559 182759 DEBUG nova.objects.instance [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4a27cb55-31e5-4343-bed7-44671f47ae20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.700 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <uuid>4a27cb55-31e5-4343-bed7-44671f47ae20</uuid>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <name>instance-0000007c</name>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:name>tempest-tempest.common.compute-instance-1102130362-2</nova:name>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:14:27</nova:creationTime>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:user uuid="00a7d470e36045deabd5584bd3a9c73e">tempest-MultipleCreateTestJSON-620854064-project-member</nova:user>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:project uuid="f02fc2085f6340ffa895cb894fdf5882">tempest-MultipleCreateTestJSON-620854064</nova:project>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        <nova:port uuid="bd929f81-9ed0-4db8-9e56-d5c7f3261b6a">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <entry name="serial">4a27cb55-31e5-4343-bed7-44671f47ae20</entry>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <entry name="uuid">4a27cb55-31e5-4343-bed7-44671f47ae20</entry>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.config"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:89:3a:5c"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <target dev="tapbd929f81-9e"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/console.log" append="off"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:14:27 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:14:27 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:14:27 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:14:27 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.702 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Preparing to wait for external event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.703 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.703 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.703 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.704 182759 DEBUG nova.virt.libvirt.vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1102130362',display_name='tempest-tempest.common.compute-instance-1102130362-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1102130362-2',id=124,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-u2tu2ftn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:14:15Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=4a27cb55-31e5-4343-bed7-44671f47ae20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.705 182759 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.706 182759 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.707 182759 DEBUG os_vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.707 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.708 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.708 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.711 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.711 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd929f81-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.712 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbd929f81-9e, col_values=(('external_ids', {'iface-id': 'bd929f81-9ed0-4db8-9e56-d5c7f3261b6a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:3a:5c', 'vm-uuid': '4a27cb55-31e5-4343-bed7-44671f47ae20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.713 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:27 np0005591285 NetworkManager[55017]: <info>  [1769040867.7150] manager: (tapbd929f81-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:27 np0005591285 nova_compute[182755]: 2026-01-22 00:14:27.723 182759 INFO os_vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e')#033[00m
Jan 21 19:14:28 np0005591285 nova_compute[182755]: 2026-01-22 00:14:28.118 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:14:28 np0005591285 nova_compute[182755]: 2026-01-22 00:14:28.118 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:14:28 np0005591285 nova_compute[182755]: 2026-01-22 00:14:28.119 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No VIF found with MAC fa:16:3e:89:3a:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:14:28 np0005591285 nova_compute[182755]: 2026-01-22 00:14:28.119 182759 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Using config drive#033[00m
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.062 182759 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Creating config drive at /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.config#033[00m
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.072 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wafx3cz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.204 182759 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wafx3cz" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:14:29 np0005591285 kernel: tapbd929f81-9e: entered promiscuous mode
Jan 21 19:14:29 np0005591285 NetworkManager[55017]: <info>  [1769040869.2693] manager: (tapbd929f81-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 21 19:14:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:29Z|00478|binding|INFO|Claiming lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a for this chassis.
Jan 21 19:14:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:29Z|00479|binding|INFO|bd929f81-9ed0-4db8-9e56-d5c7f3261b6a: Claiming fa:16:3e:89:3a:5c 10.100.0.7
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.270 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.290 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:3a:5c 10.100.0.7'], port_security=['fa:16:3e:89:3a:5c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4a27cb55-31e5-4343-bed7-44671f47ae20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.292 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bd929f81-9ed0-4db8-9e56-d5c7f3261b6a in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 bound to our chassis#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.293 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c19848fe-a435-4c66-8190-94e8e9e1b266#033[00m
Jan 21 19:14:29 np0005591285 systemd-udevd[231268]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.307 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdea7fa-b40d-4764-aff7-aa56eab7f765]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.308 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc19848fe-a1 in ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.310 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc19848fe-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.310 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[154e9e96-081c-4ac9-84bf-9acc922fe001]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.311 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[13957b78-503e-4084-82c1-3000ebc3e2ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:29Z|00480|binding|INFO|Setting lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a ovn-installed in OVS
Jan 21 19:14:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:29Z|00481|binding|INFO|Setting lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a up in Southbound
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.354 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:29 np0005591285 systemd-machined[154022]: New machine qemu-57-instance-0000007c.
Jan 21 19:14:29 np0005591285 NetworkManager[55017]: <info>  [1769040869.3649] device (tapbd929f81-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:14:29 np0005591285 NetworkManager[55017]: <info>  [1769040869.3659] device (tapbd929f81-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.365 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[7072c514-fedf-4204-8a30-90fe3d8f19c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 systemd[1]: Started Virtual Machine qemu-57-instance-0000007c.
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.382 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3a212ee1-972e-4ac2-a6ff-febc1e50b60f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.414 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[74023704-4e2a-45ec-9043-dbe6c8a5aaad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 NetworkManager[55017]: <info>  [1769040869.4239] manager: (tapc19848fe-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.423 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7c015c44-1196-42e6-bb93-14cd07289b26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 systemd-udevd[231274]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.458 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[65b17b59-19ac-424c-81ba-d12fbd3ed1e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.461 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bd457f7e-b487-4853-b276-3dcd323289b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 NetworkManager[55017]: <info>  [1769040869.4877] device (tapc19848fe-a0): carrier: link connected
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.493 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[20a29a96-a5c3-4830-ae8f-cb4119f5aa97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.512 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0c0761-0dc0-461b-a9e4-08021878a06e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542814, 'reachable_time': 42175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231303, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.529 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7234f47b-4d4c-4757-b223-7ed982b8e3a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:5cb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542814, 'tstamp': 542814}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231304, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.546 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ed02df-c691-4aa0-b109-5c680fd0dca6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542814, 'reachable_time': 42175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231305, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.577 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e6bc3fc8-f72a-4ab0-8d80-323640e5b1b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.640 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a35553a1-d505-4389-8b32-14fc589ca770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.642 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.642 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.642 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc19848fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.644 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:29 np0005591285 NetworkManager[55017]: <info>  [1769040869.6447] manager: (tapc19848fe-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 21 19:14:29 np0005591285 kernel: tapc19848fe-a0: entered promiscuous mode
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.647 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.654 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc19848fe-a0, col_values=(('external_ids', {'iface-id': 'ba768391-9e0e-4cf0-83c5-526ca3a05a58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.655 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:29Z|00482|binding|INFO|Releasing lport ba768391-9e0e-4cf0-83c5-526ca3a05a58 from this chassis (sb_readonly=0)
Jan 21 19:14:29 np0005591285 nova_compute[182755]: 2026-01-22 00:14:29.668 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.669 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.670 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f4a911cb-5db5-4602-a420-d7620f931542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.671 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:14:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:29.672 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'env', 'PROCESS_TAG=haproxy-c19848fe-a435-4c66-8190-94e8e9e1b266', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c19848fe-a435-4c66-8190-94e8e9e1b266.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.010 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040870.0095162, 4a27cb55-31e5-4343-bed7-44671f47ae20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.011 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] VM Started (Lifecycle Event)#033[00m
Jan 21 19:14:30 np0005591285 podman[231343]: 2026-01-22 00:14:30.036064123 +0000 UTC m=+0.055729077 container create 9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:14:30 np0005591285 systemd[1]: Started libpod-conmon-9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f.scope.
Jan 21 19:14:30 np0005591285 podman[231343]: 2026-01-22 00:14:30.00296559 +0000 UTC m=+0.022630524 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:14:30 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:14:30 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15a00b631118e6a3d771a11190222858f19d81e8da9e8025e9f4eb2c45941f35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:14:30 np0005591285 podman[231343]: 2026-01-22 00:14:30.12712747 +0000 UTC m=+0.146792424 container init 9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:14:30 np0005591285 podman[231343]: 2026-01-22 00:14:30.134316002 +0000 UTC m=+0.153980946 container start 9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:14:30 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [NOTICE]   (231364) : New worker (231366) forked
Jan 21 19:14:30 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [NOTICE]   (231364) : Loading success.
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.514 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.521 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040870.0106876, 4a27cb55-31e5-4343-bed7-44671f47ae20 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.521 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.845 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:14:30 np0005591285 nova_compute[182755]: 2026-01-22 00:14:30.850 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.134 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.330 182759 DEBUG nova.network.neutron [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Updated VIF entry in instance network info cache for port bd929f81-9ed0-4db8-9e56-d5c7f3261b6a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.331 182759 DEBUG nova.network.neutron [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Updating instance_info_cache with network_info: [{"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.426 182759 DEBUG oslo_concurrency.lockutils [req-fefac87b-60f7-4185-8ece-cc5549933116 req-d076da06-6d2e-4dba-9af7-ba2c58c70ce1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-4a27cb55-31e5-4343-bed7-44671f47ae20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.478 182759 DEBUG nova.compute.manager [req-bd7c70bd-2efb-4b67-b0b0-83c6131022f6 req-5e5fb95d-6359-4bad-8714-5335ee499061 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.479 182759 DEBUG oslo_concurrency.lockutils [req-bd7c70bd-2efb-4b67-b0b0-83c6131022f6 req-5e5fb95d-6359-4bad-8714-5335ee499061 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.479 182759 DEBUG oslo_concurrency.lockutils [req-bd7c70bd-2efb-4b67-b0b0-83c6131022f6 req-5e5fb95d-6359-4bad-8714-5335ee499061 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.479 182759 DEBUG oslo_concurrency.lockutils [req-bd7c70bd-2efb-4b67-b0b0-83c6131022f6 req-5e5fb95d-6359-4bad-8714-5335ee499061 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.480 182759 DEBUG nova.compute.manager [req-bd7c70bd-2efb-4b67-b0b0-83c6131022f6 req-5e5fb95d-6359-4bad-8714-5335ee499061 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Processing event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.480 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.483 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040871.4837384, 4a27cb55-31e5-4343-bed7-44671f47ae20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.484 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.485 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.488 182759 INFO nova.virt.libvirt.driver [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Instance spawned successfully.#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.488 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.557 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.562 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.566 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.566 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.567 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.567 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.568 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.568 182759 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.671 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.893 182759 INFO nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Took 16.29 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:14:31 np0005591285 nova_compute[182755]: 2026-01-22 00:14:31.894 182759 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:14:32 np0005591285 nova_compute[182755]: 2026-01-22 00:14:32.040 182759 INFO nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Took 17.24 seconds to build instance.#033[00m
Jan 21 19:14:32 np0005591285 nova_compute[182755]: 2026-01-22 00:14:32.109 182759 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:32 np0005591285 nova_compute[182755]: 2026-01-22 00:14:32.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:32 np0005591285 nova_compute[182755]: 2026-01-22 00:14:32.713 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:33 np0005591285 nova_compute[182755]: 2026-01-22 00:14:33.744 182759 DEBUG nova.compute.manager [req-7cc02180-0042-4fc6-82b4-b015177deb22 req-e3662960-3406-401e-8311-426049b90356 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:14:33 np0005591285 nova_compute[182755]: 2026-01-22 00:14:33.745 182759 DEBUG oslo_concurrency.lockutils [req-7cc02180-0042-4fc6-82b4-b015177deb22 req-e3662960-3406-401e-8311-426049b90356 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:33 np0005591285 nova_compute[182755]: 2026-01-22 00:14:33.745 182759 DEBUG oslo_concurrency.lockutils [req-7cc02180-0042-4fc6-82b4-b015177deb22 req-e3662960-3406-401e-8311-426049b90356 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:33 np0005591285 nova_compute[182755]: 2026-01-22 00:14:33.745 182759 DEBUG oslo_concurrency.lockutils [req-7cc02180-0042-4fc6-82b4-b015177deb22 req-e3662960-3406-401e-8311-426049b90356 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:33 np0005591285 nova_compute[182755]: 2026-01-22 00:14:33.745 182759 DEBUG nova.compute.manager [req-7cc02180-0042-4fc6-82b4-b015177deb22 req-e3662960-3406-401e-8311-426049b90356 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] No waiting events found dispatching network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:14:33 np0005591285 nova_compute[182755]: 2026-01-22 00:14:33.746 182759 WARNING nova.compute.manager [req-7cc02180-0042-4fc6-82b4-b015177deb22 req-e3662960-3406-401e-8311-426049b90356 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received unexpected event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a for instance with vm_state active and task_state None.#033[00m
Jan 21 19:14:34 np0005591285 podman[231375]: 2026-01-22 00:14:34.192092327 +0000 UTC m=+0.064126420 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.532 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.994 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.995 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.995 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.996 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:37 np0005591285 nova_compute[182755]: 2026-01-22 00:14:37.996 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.009 182759 INFO nova.compute.manager [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Terminating instance#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.020 182759 DEBUG nova.compute.manager [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:14:38 np0005591285 kernel: tapbd929f81-9e (unregistering): left promiscuous mode
Jan 21 19:14:38 np0005591285 NetworkManager[55017]: <info>  [1769040878.0474] device (tapbd929f81-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00483|binding|INFO|Releasing lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a from this chassis (sb_readonly=0)
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00484|binding|INFO|Setting lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a down in Southbound
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00485|binding|INFO|Removing iface tapbd929f81-9e ovn-installed in OVS
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.054 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Jan 21 19:14:38 np0005591285 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007c.scope: Consumed 7.331s CPU time.
Jan 21 19:14:38 np0005591285 systemd-machined[154022]: Machine qemu-57-instance-0000007c terminated.
Jan 21 19:14:38 np0005591285 podman[231398]: 2026-01-22 00:14:38.129595916 +0000 UTC m=+0.059040615 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:14:38 np0005591285 podman[231402]: 2026-01-22 00:14:38.151700925 +0000 UTC m=+0.078612127 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.194 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:3a:5c 10.100.0.7'], port_security=['fa:16:3e:89:3a:5c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4a27cb55-31e5-4343-bed7-44671f47ae20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.196 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bd929f81-9ed0-4db8-9e56-d5c7f3261b6a in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 unbound from our chassis#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.197 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c19848fe-a435-4c66-8190-94e8e9e1b266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.198 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[60f41783-951c-4d63-bf2e-b4ee773052ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.198 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace which is not needed anymore#033[00m
Jan 21 19:14:38 np0005591285 kernel: tapbd929f81-9e: entered promiscuous mode
Jan 21 19:14:38 np0005591285 kernel: tapbd929f81-9e (unregistering): left promiscuous mode
Jan 21 19:14:38 np0005591285 NetworkManager[55017]: <info>  [1769040878.2493] manager: (tapbd929f81-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/235)
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00486|binding|INFO|Claiming lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a for this chassis.
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00487|binding|INFO|bd929f81-9ed0-4db8-9e56-d5c7f3261b6a: Claiming fa:16:3e:89:3a:5c 10.100.0.7
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.281 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00488|if_status|INFO|Dropped 2 log messages in last 68 seconds (most recently, 68 seconds ago) due to excessive rate
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00489|if_status|INFO|Not setting lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a down as sb is readonly
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.287 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.294 182759 INFO nova.virt.libvirt.driver [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Instance destroyed successfully.#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.295 182759 DEBUG nova.objects.instance [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'resources' on Instance uuid 4a27cb55-31e5-4343-bed7-44671f47ae20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:14:38 np0005591285 ovn_controller[94908]: 2026-01-22T00:14:38Z|00490|binding|INFO|Releasing lport bd929f81-9ed0-4db8-9e56-d5c7f3261b6a from this chassis (sb_readonly=0)
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.327 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:3a:5c 10.100.0.7'], port_security=['fa:16:3e:89:3a:5c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4a27cb55-31e5-4343-bed7-44671f47ae20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.338 182759 DEBUG nova.virt.libvirt.vif [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1102130362',display_name='tempest-tempest.common.compute-instance-1102130362-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1102130362-2',id=124,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-22T00:14:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-u2tu2ftn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:31Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=4a27cb55-31e5-4343-bed7-44671f47ae20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.338 182759 DEBUG nova.network.os_vif_util [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "address": "fa:16:3e:89:3a:5c", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd929f81-9e", "ovs_interfaceid": "bd929f81-9ed0-4db8-9e56-d5c7f3261b6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:14:38 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [NOTICE]   (231364) : haproxy version is 2.8.14-c23fe91
Jan 21 19:14:38 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [NOTICE]   (231364) : path to executable is /usr/sbin/haproxy
Jan 21 19:14:38 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [WARNING]  (231364) : Exiting Master process...
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.340 182759 DEBUG nova.network.os_vif_util [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:14:38 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [ALERT]    (231364) : Current worker (231366) exited with code 143 (Terminated)
Jan 21 19:14:38 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231360]: [WARNING]  (231364) : All workers exited. Exiting... (0)
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.341 182759 DEBUG os_vif [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:14:38 np0005591285 systemd[1]: libpod-9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f.scope: Deactivated successfully.
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.343 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.344 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd929f81-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.345 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.346 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 podman[231476]: 2026-01-22 00:14:38.350252269 +0000 UTC m=+0.049809279 container died 9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.353 182759 INFO os_vif [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:3a:5c,bridge_name='br-int',has_traffic_filtering=True,id=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd929f81-9e')#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.355 182759 INFO nova.virt.libvirt.driver [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Deleting instance files /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20_del#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.357 182759 INFO nova.virt.libvirt.driver [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Deletion of /var/lib/nova/instances/4a27cb55-31e5-4343-bed7-44671f47ae20_del complete#033[00m
Jan 21 19:14:38 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f-userdata-shm.mount: Deactivated successfully.
Jan 21 19:14:38 np0005591285 systemd[1]: var-lib-containers-storage-overlay-15a00b631118e6a3d771a11190222858f19d81e8da9e8025e9f4eb2c45941f35-merged.mount: Deactivated successfully.
Jan 21 19:14:38 np0005591285 podman[231476]: 2026-01-22 00:14:38.39569852 +0000 UTC m=+0.095255530 container cleanup 9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:14:38 np0005591285 systemd[1]: libpod-conmon-9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f.scope: Deactivated successfully.
Jan 21 19:14:38 np0005591285 podman[231507]: 2026-01-22 00:14:38.451913109 +0000 UTC m=+0.035804316 container remove 9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.458 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1d761009-593f-4c07-9967-087c321705c8]: (4, ('Thu Jan 22 12:14:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f)\n9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f\nThu Jan 22 12:14:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f)\n9248e21d9290965136c26d174c039691772999baaa4fedca13172fb7cbb61e7f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.460 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0675ee55-66ff-41f3-a015-e4e129bfcc5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.461 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:14:38 np0005591285 kernel: tapc19848fe-a0: left promiscuous mode
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.464 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.479 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.484 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2f270670-ec11-47cf-8a89-a9d4d5ce9262]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.502 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4c734e23-b3c9-4ffc-8a70-c5cdded32eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.503 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9efbeda6-7cc1-4a6b-983e-d4a61a17fd42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.506 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:3a:5c 10.100.0.7'], port_security=['fa:16:3e:89:3a:5c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4a27cb55-31e5-4343-bed7-44671f47ae20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=bd929f81-9ed0-4db8-9e56-d5c7f3261b6a) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.527 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb6a0ec-14f4-4cca-8c5e-3636b3ce264e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542806, 'reachable_time': 25459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231523, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 systemd[1]: run-netns-ovnmeta\x2dc19848fe\x2da435\x2d4c66\x2d8190\x2d94e8e9e1b266.mount: Deactivated successfully.
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.530 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.530 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf148e2-fbee-4ae3-96d0-7ed442da30e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.532 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bd929f81-9ed0-4db8-9e56-d5c7f3261b6a in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 unbound from our chassis#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.533 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c19848fe-a435-4c66-8190-94e8e9e1b266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.534 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[697df460-4524-4f3c-9f0a-c682b2309f41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.534 104259 INFO neutron.agent.ovn.metadata.agent [-] Port bd929f81-9ed0-4db8-9e56-d5c7f3261b6a in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 unbound from our chassis#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.535 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c19848fe-a435-4c66-8190-94e8e9e1b266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:14:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:14:38.536 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[76d14c6f-65d4-4fb7-9f63-f1e8b749b87d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.725 182759 DEBUG nova.compute.manager [req-1fccf402-2105-4926-ae47-582d60f10fd3 req-f200b881-3a90-4c2d-9918-3739de0696a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-vif-unplugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.725 182759 DEBUG oslo_concurrency.lockutils [req-1fccf402-2105-4926-ae47-582d60f10fd3 req-f200b881-3a90-4c2d-9918-3739de0696a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.725 182759 DEBUG oslo_concurrency.lockutils [req-1fccf402-2105-4926-ae47-582d60f10fd3 req-f200b881-3a90-4c2d-9918-3739de0696a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.726 182759 DEBUG oslo_concurrency.lockutils [req-1fccf402-2105-4926-ae47-582d60f10fd3 req-f200b881-3a90-4c2d-9918-3739de0696a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.726 182759 DEBUG nova.compute.manager [req-1fccf402-2105-4926-ae47-582d60f10fd3 req-f200b881-3a90-4c2d-9918-3739de0696a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] No waiting events found dispatching network-vif-unplugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.726 182759 DEBUG nova.compute.manager [req-1fccf402-2105-4926-ae47-582d60f10fd3 req-f200b881-3a90-4c2d-9918-3739de0696a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-vif-unplugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.791 182759 INFO nova.compute.manager [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.791 182759 DEBUG oslo.service.loopingcall [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.792 182759 DEBUG nova.compute.manager [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:14:38 np0005591285 nova_compute[182755]: 2026-01-22 00:14:38.792 182759 DEBUG nova.network.neutron [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:14:40 np0005591285 podman[231524]: 2026-01-22 00:14:40.253811179 +0000 UTC m=+0.118297564 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 21 19:14:40 np0005591285 nova_compute[182755]: 2026-01-22 00:14:40.966 182759 DEBUG nova.compute.manager [req-75de2097-1771-4023-9a85-808ab9e70d7e req-424d157b-5dc0-4d4b-bdd8-c114760e25da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:14:40 np0005591285 nova_compute[182755]: 2026-01-22 00:14:40.966 182759 DEBUG oslo_concurrency.lockutils [req-75de2097-1771-4023-9a85-808ab9e70d7e req-424d157b-5dc0-4d4b-bdd8-c114760e25da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:40 np0005591285 nova_compute[182755]: 2026-01-22 00:14:40.966 182759 DEBUG oslo_concurrency.lockutils [req-75de2097-1771-4023-9a85-808ab9e70d7e req-424d157b-5dc0-4d4b-bdd8-c114760e25da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:40 np0005591285 nova_compute[182755]: 2026-01-22 00:14:40.967 182759 DEBUG oslo_concurrency.lockutils [req-75de2097-1771-4023-9a85-808ab9e70d7e req-424d157b-5dc0-4d4b-bdd8-c114760e25da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:40 np0005591285 nova_compute[182755]: 2026-01-22 00:14:40.967 182759 DEBUG nova.compute.manager [req-75de2097-1771-4023-9a85-808ab9e70d7e req-424d157b-5dc0-4d4b-bdd8-c114760e25da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] No waiting events found dispatching network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:14:40 np0005591285 nova_compute[182755]: 2026-01-22 00:14:40.967 182759 WARNING nova.compute.manager [req-75de2097-1771-4023-9a85-808ab9e70d7e req-424d157b-5dc0-4d4b-bdd8-c114760e25da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received unexpected event network-vif-plugged-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.012 182759 DEBUG nova.network.neutron [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.073 182759 INFO nova.compute.manager [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Took 2.28 seconds to deallocate network for instance.#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.196 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.197 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.397 182759 DEBUG nova.compute.provider_tree [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.635 182759 DEBUG nova.scheduler.client.report [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.677 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.763 182759 INFO nova.scheduler.client.report [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Deleted allocations for instance 4a27cb55-31e5-4343-bed7-44671f47ae20#033[00m
Jan 21 19:14:41 np0005591285 nova_compute[182755]: 2026-01-22 00:14:41.896 182759 DEBUG oslo_concurrency.lockutils [None req-9a8b7fbc-e94c-477b-9318-19758ba23e21 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "4a27cb55-31e5-4343-bed7-44671f47ae20" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.900s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:42 np0005591285 nova_compute[182755]: 2026-01-22 00:14:42.533 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:43 np0005591285 nova_compute[182755]: 2026-01-22 00:14:43.174 182759 DEBUG nova.compute.manager [req-4082c61a-3eb8-4b35-94c9-795dfbd8dd86 req-43ad23c7-7659-4ece-8039-e0bfd5742f2a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Received event network-vif-deleted-bd929f81-9ed0-4db8-9e56-d5c7f3261b6a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:14:43 np0005591285 nova_compute[182755]: 2026-01-22 00:14:43.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:43 np0005591285 nova_compute[182755]: 2026-01-22 00:14:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:43 np0005591285 nova_compute[182755]: 2026-01-22 00:14:43.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:14:43 np0005591285 nova_compute[182755]: 2026-01-22 00:14:43.347 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:45 np0005591285 nova_compute[182755]: 2026-01-22 00:14:45.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:46 np0005591285 nova_compute[182755]: 2026-01-22 00:14:46.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:47 np0005591285 nova_compute[182755]: 2026-01-22 00:14:47.535 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:48 np0005591285 nova_compute[182755]: 2026-01-22 00:14:48.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:48 np0005591285 nova_compute[182755]: 2026-01-22 00:14:48.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:14:48 np0005591285 nova_compute[182755]: 2026-01-22 00:14:48.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:14:48 np0005591285 nova_compute[182755]: 2026-01-22 00:14:48.349 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:49 np0005591285 nova_compute[182755]: 2026-01-22 00:14:49.032 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:14:49 np0005591285 nova_compute[182755]: 2026-01-22 00:14:49.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:49 np0005591285 nova_compute[182755]: 2026-01-22 00:14:49.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.489 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.490 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.491 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.492 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.538 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.746 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.747 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5662MB free_disk=73.19332885742188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.747 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.748 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.974 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.974 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:14:52 np0005591285 nova_compute[182755]: 2026-01-22 00:14:52.992 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.019 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.020 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.038 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.069 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.101 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.292 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040878.292164, 4a27cb55-31e5-4343-bed7-44671f47ae20 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.293 182759 INFO nova.compute.manager [-] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.499 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.510 182759 DEBUG nova.compute.manager [None req-40a3cfb9-47cb-4d4a-8366-b7c42155d62e - - - - - -] [instance: 4a27cb55-31e5-4343-bed7-44671f47ae20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.546 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:14:53 np0005591285 nova_compute[182755]: 2026-01-22 00:14:53.546 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:14:54 np0005591285 nova_compute[182755]: 2026-01-22 00:14:54.546 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:14:55 np0005591285 podman[231552]: 2026-01-22 00:14:55.185109206 +0000 UTC m=+0.057302929 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 19:14:55 np0005591285 podman[231553]: 2026-01-22 00:14:55.222704479 +0000 UTC m=+0.083223551 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 19:14:57 np0005591285 nova_compute[182755]: 2026-01-22 00:14:57.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:57 np0005591285 nova_compute[182755]: 2026-01-22 00:14:57.966 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:57 np0005591285 nova_compute[182755]: 2026-01-22 00:14:57.967 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:58 np0005591285 nova_compute[182755]: 2026-01-22 00:14:58.397 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:14:58 np0005591285 nova_compute[182755]: 2026-01-22 00:14:58.761 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:14:59 np0005591285 nova_compute[182755]: 2026-01-22 00:14:59.694 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:14:59 np0005591285 nova_compute[182755]: 2026-01-22 00:14:59.694 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:14:59 np0005591285 nova_compute[182755]: 2026-01-22 00:14:59.703 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:14:59 np0005591285 nova_compute[182755]: 2026-01-22 00:14:59.703 182759 INFO nova.compute.claims [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.315 182759 DEBUG nova.compute.provider_tree [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.332 182759 DEBUG nova.scheduler.client.report [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.381 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.382 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.484 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.485 182759 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.526 182759 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.568 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.778 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.779 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.779 182759 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Creating image(s)#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.780 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "/var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.780 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.781 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.794 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.849 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.850 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.851 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.862 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.919 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.921 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.957 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.958 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.958 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:00 np0005591285 nova_compute[182755]: 2026-01-22 00:15:00.992 182759 DEBUG nova.policy [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00a7d470e36045deabd5584bd3a9c73e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.013 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.014 182759 DEBUG nova.virt.disk.api [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Checking if we can resize image /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.014 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.073 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.074 182759 DEBUG nova.virt.disk.api [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Cannot resize image /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.075 182759 DEBUG nova.objects.instance [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'migration_context' on Instance uuid 40f18c70-81c1-4729-929c-5368ae297eb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.294 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.295 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Ensure instance console log exists: /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.295 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.295 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:01 np0005591285 nova_compute[182755]: 2026-01-22 00:15:01.296 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:02 np0005591285 nova_compute[182755]: 2026-01-22 00:15:02.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:02.977 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:02.978 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:02.978 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:03 np0005591285 nova_compute[182755]: 2026-01-22 00:15:03.399 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:03 np0005591285 nova_compute[182755]: 2026-01-22 00:15:03.971 182759 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Successfully created port: 7c155749-1486-479f-9ee6-d99ea840c942 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:15:05 np0005591285 podman[231611]: 2026-01-22 00:15:05.200978292 +0000 UTC m=+0.066231397 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:15:07 np0005591285 nova_compute[182755]: 2026-01-22 00:15:07.562 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:08 np0005591285 nova_compute[182755]: 2026-01-22 00:15:08.401 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.087 182759 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Successfully updated port: 7c155749-1486-479f-9ee6-d99ea840c942 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.116 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "refresh_cache-40f18c70-81c1-4729-929c-5368ae297eb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.117 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquired lock "refresh_cache-40f18c70-81c1-4729-929c-5368ae297eb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.117 182759 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:15:09 np0005591285 podman[231636]: 2026-01-22 00:15:09.198748068 +0000 UTC m=+0.066169426 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:15:09 np0005591285 podman[231637]: 2026-01-22 00:15:09.199441776 +0000 UTC m=+0.068744644 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.230 182759 DEBUG nova.compute.manager [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received event network-changed-7c155749-1486-479f-9ee6-d99ea840c942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.230 182759 DEBUG nova.compute.manager [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Refreshing instance network info cache due to event network-changed-7c155749-1486-479f-9ee6-d99ea840c942. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.230 182759 DEBUG oslo_concurrency.lockutils [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-40f18c70-81c1-4729-929c-5368ae297eb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:15:09 np0005591285 nova_compute[182755]: 2026-01-22 00:15:09.543 182759 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.532 182759 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Updating instance_info_cache with network_info: [{"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.555 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Releasing lock "refresh_cache-40f18c70-81c1-4729-929c-5368ae297eb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.555 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Instance network_info: |[{"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.556 182759 DEBUG oslo_concurrency.lockutils [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-40f18c70-81c1-4729-929c-5368ae297eb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.556 182759 DEBUG nova.network.neutron [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Refreshing network info cache for port 7c155749-1486-479f-9ee6-d99ea840c942 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.560 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Start _get_guest_xml network_info=[{"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.565 182759 WARNING nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.579 182759 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.580 182759 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.586 182759 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.587 182759 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.589 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.589 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.590 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.590 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.591 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.591 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.591 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.591 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.592 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.592 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.593 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.593 182759 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.600 182759 DEBUG nova.virt.libvirt.vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1628567258',display_name='tempest-MultipleCreateTestJSON-server-1628567258-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1628567258-1',id=126,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-ivpvh0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:00Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=40f18c70-81c1-4729-929c-5368ae297eb8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.600 182759 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.601 182759 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.603 182759 DEBUG nova.objects.instance [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'pci_devices' on Instance uuid 40f18c70-81c1-4729-929c-5368ae297eb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.621 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <uuid>40f18c70-81c1-4729-929c-5368ae297eb8</uuid>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <name>instance-0000007e</name>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:name>tempest-MultipleCreateTestJSON-server-1628567258-1</nova:name>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:15:10</nova:creationTime>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:user uuid="00a7d470e36045deabd5584bd3a9c73e">tempest-MultipleCreateTestJSON-620854064-project-member</nova:user>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:project uuid="f02fc2085f6340ffa895cb894fdf5882">tempest-MultipleCreateTestJSON-620854064</nova:project>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        <nova:port uuid="7c155749-1486-479f-9ee6-d99ea840c942">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <entry name="serial">40f18c70-81c1-4729-929c-5368ae297eb8</entry>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <entry name="uuid">40f18c70-81c1-4729-929c-5368ae297eb8</entry>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.config"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:36:d6:55"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <target dev="tap7c155749-14"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/console.log" append="off"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:15:10 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:15:10 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:15:10 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:15:10 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.623 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Preparing to wait for external event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.623 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.623 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.623 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.624 182759 DEBUG nova.virt.libvirt.vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1628567258',display_name='tempest-MultipleCreateTestJSON-server-1628567258-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1628567258-1',id=126,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-ivpvh0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:00Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=40f18c70-81c1-4729-929c-5368ae297eb8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.624 182759 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.625 182759 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.625 182759 DEBUG os_vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.625 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.626 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.626 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.628 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c155749-14, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.629 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c155749-14, col_values=(('external_ids', {'iface-id': '7c155749-1486-479f-9ee6-d99ea840c942', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:d6:55', 'vm-uuid': '40f18c70-81c1-4729-929c-5368ae297eb8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.632 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:10 np0005591285 NetworkManager[55017]: <info>  [1769040910.6331] manager: (tap7c155749-14): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.641 182759 INFO os_vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14')#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.762 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.762 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.763 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No VIF found with MAC fa:16:3e:36:d6:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:15:10 np0005591285 nova_compute[182755]: 2026-01-22 00:15:10.763 182759 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Using config drive#033[00m
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.205 182759 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Creating config drive at /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.config#033[00m
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.217 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdqx7ztj8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:11 np0005591285 podman[231680]: 2026-01-22 00:15:11.224445855 +0000 UTC m=+0.093984227 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.347 182759 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdqx7ztj8" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:11 np0005591285 kernel: tap7c155749-14: entered promiscuous mode
Jan 21 19:15:11 np0005591285 NetworkManager[55017]: <info>  [1769040911.4055] manager: (tap7c155749-14): new Tun device (/org/freedesktop/NetworkManager/Devices/237)
Jan 21 19:15:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:11Z|00491|binding|INFO|Claiming lport 7c155749-1486-479f-9ee6-d99ea840c942 for this chassis.
Jan 21 19:15:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:11Z|00492|binding|INFO|7c155749-1486-479f-9ee6-d99ea840c942: Claiming fa:16:3e:36:d6:55 10.100.0.5
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.413 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:d6:55 10.100.0.5'], port_security=['fa:16:3e:36:d6:55 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '40f18c70-81c1-4729-929c-5368ae297eb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=7c155749-1486-479f-9ee6-d99ea840c942) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.414 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 7c155749-1486-479f-9ee6-d99ea840c942 in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 bound to our chassis#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.415 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c19848fe-a435-4c66-8190-94e8e9e1b266#033[00m
Jan 21 19:15:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:11Z|00493|binding|INFO|Setting lport 7c155749-1486-479f-9ee6-d99ea840c942 ovn-installed in OVS
Jan 21 19:15:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:11Z|00494|binding|INFO|Setting lport 7c155749-1486-479f-9ee6-d99ea840c942 up in Southbound
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.428 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:11 np0005591285 systemd-udevd[231721]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.428 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4abcfe4f-29e9-4628-b863-1852e3c9485d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.429 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc19848fe-a1 in ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.431 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc19848fe-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.431 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[475d4634-249c-44c4-99ec-f31a1a8dd546]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.432 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[86efd6df-56da-4207-83b9-4fdbe1e43b35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.442 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[df48d8cc-bc30-429c-b78a-0d98db7324af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 NetworkManager[55017]: <info>  [1769040911.4440] device (tap7c155749-14): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:15:11 np0005591285 NetworkManager[55017]: <info>  [1769040911.4452] device (tap7c155749-14): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:15:11 np0005591285 systemd-machined[154022]: New machine qemu-58-instance-0000007e.
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.456 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[865019d2-c8e6-4228-be0d-9796b9bb9b8d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 systemd[1]: Started Virtual Machine qemu-58-instance-0000007e.
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.481 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c5cc9f87-a037-47c9-858c-af1d55167a91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 systemd-udevd[231726]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.487 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[04e80b90-b0d5-460d-aeb2-1dec57c12f90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 NetworkManager[55017]: <info>  [1769040911.4892] manager: (tapc19848fe-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/238)
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.515 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b755e5-e2cc-48c4-afe4-a8361bdf1f73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.518 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8575239b-4bf5-4a0e-98d5-7ead9fd9ea1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 NetworkManager[55017]: <info>  [1769040911.5375] device (tapc19848fe-a0): carrier: link connected
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.544 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[55572130-64c9-47d1-9a85-cbcf39ff53f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.560 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[98fe6aa4-3135-47a0-b347-5478a8f91f14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 153], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547019, 'reachable_time': 44188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231755, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.576 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59a6a61c-61d6-481d-892e-20fefca46bb0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:5cb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547019, 'tstamp': 547019}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231757, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.594 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2aeff386-2a25-440f-bc79-b9c81bb44384]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 153], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547019, 'reachable_time': 44188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231758, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.634 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d7dfd249-de76-49f7-b231-7be6ffcd0c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.695 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f26019-7805-49de-94e6-85503e81b038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.696 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.697 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.697 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc19848fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:11 np0005591285 kernel: tapc19848fe-a0: entered promiscuous mode
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.698 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:11 np0005591285 NetworkManager[55017]: <info>  [1769040911.7013] manager: (tapc19848fe-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.701 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.701 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc19848fe-a0, col_values=(('external_ids', {'iface-id': 'ba768391-9e0e-4cf0-83c5-526ca3a05a58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.702 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:11Z|00495|binding|INFO|Releasing lport ba768391-9e0e-4cf0-83c5-526ca3a05a58 from this chassis (sb_readonly=0)
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.705 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.706 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.706 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[447d1881-88df-4e3b-b65a-de843d821d64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.707 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 19:15:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:11.708 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'env', 'PROCESS_TAG=haproxy-c19848fe-a435-4c66-8190-94e8e9e1b266', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c19848fe-a435-4c66-8190-94e8e9e1b266.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.894 182759 DEBUG nova.compute.manager [req-7a73ca90-c54e-47c7-8591-aaa0e23f5f55 req-36127b86-d499-4088-a328-f7a7fe890878 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.895 182759 DEBUG oslo_concurrency.lockutils [req-7a73ca90-c54e-47c7-8591-aaa0e23f5f55 req-36127b86-d499-4088-a328-f7a7fe890878 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.895 182759 DEBUG oslo_concurrency.lockutils [req-7a73ca90-c54e-47c7-8591-aaa0e23f5f55 req-36127b86-d499-4088-a328-f7a7fe890878 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.896 182759 DEBUG oslo_concurrency.lockutils [req-7a73ca90-c54e-47c7-8591-aaa0e23f5f55 req-36127b86-d499-4088-a328-f7a7fe890878 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.896 182759 DEBUG nova.compute.manager [req-7a73ca90-c54e-47c7-8591-aaa0e23f5f55 req-36127b86-d499-4088-a328-f7a7fe890878 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Processing event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.963 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040911.962508, 40f18c70-81c1-4729-929c-5368ae297eb8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.963 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] VM Started (Lifecycle Event)
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.965 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.969 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.972 182759 INFO nova.virt.libvirt.driver [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Instance spawned successfully.
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.973 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.988 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:15:11 np0005591285 nova_compute[182755]: 2026-01-22 00:15:11.995 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.000 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.001 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.001 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.002 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.002 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.003 182759 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.028 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.029 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040911.9634082, 40f18c70-81c1-4729-929c-5368ae297eb8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.029 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] VM Paused (Lifecycle Event)
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.059 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.063 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040911.96793, 40f18c70-81c1-4729-929c-5368ae297eb8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.063 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] VM Resumed (Lifecycle Event)
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.091 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.094 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.097 182759 INFO nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Took 11.32 seconds to spawn the instance on the hypervisor.
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.098 182759 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:15:12 np0005591285 podman[231797]: 2026-01-22 00:15:12.112234525 +0000 UTC m=+0.066599937 container create 8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.126 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:15:12 np0005591285 systemd[1]: Started libpod-conmon-8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f.scope.
Jan 21 19:15:12 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:15:12 np0005591285 podman[231797]: 2026-01-22 00:15:12.08095989 +0000 UTC m=+0.035325342 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:15:12 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cf8d423dbf9343aaf173cbb9c6fc95091f5a2cdfa708608948902ea8e1553e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.209 182759 INFO nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Took 12.70 seconds to build instance.
Jan 21 19:15:12 np0005591285 podman[231797]: 2026-01-22 00:15:12.218256221 +0000 UTC m=+0.172621673 container init 8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 19:15:12 np0005591285 podman[231797]: 2026-01-22 00:15:12.225296659 +0000 UTC m=+0.179662081 container start 8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.242 182759 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:12 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [NOTICE]   (231817) : New worker (231819) forked
Jan 21 19:15:12 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [NOTICE]   (231817) : Loading success.
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.506 182759 DEBUG nova.network.neutron [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Updated VIF entry in instance network info cache for port 7c155749-1486-479f-9ee6-d99ea840c942. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.507 182759 DEBUG nova.network.neutron [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Updating instance_info_cache with network_info: [{"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.564 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:12 np0005591285 nova_compute[182755]: 2026-01-22 00:15:12.601 182759 DEBUG oslo_concurrency.lockutils [req-a4796d53-c453-4be4-9669-1109b3f0a67f req-de2ecf49-554f-41ff-9d4c-6b9f8aa87a80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-40f18c70-81c1-4729-929c-5368ae297eb8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:15:14 np0005591285 nova_compute[182755]: 2026-01-22 00:15:14.114 182759 DEBUG nova.compute.manager [req-6a60daf2-da13-40db-b399-76f929ea82e0 req-ce0403d6-db80-4936-9b70-f377b06ad544 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:15:14 np0005591285 nova_compute[182755]: 2026-01-22 00:15:14.116 182759 DEBUG oslo_concurrency.lockutils [req-6a60daf2-da13-40db-b399-76f929ea82e0 req-ce0403d6-db80-4936-9b70-f377b06ad544 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:14 np0005591285 nova_compute[182755]: 2026-01-22 00:15:14.117 182759 DEBUG oslo_concurrency.lockutils [req-6a60daf2-da13-40db-b399-76f929ea82e0 req-ce0403d6-db80-4936-9b70-f377b06ad544 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:14 np0005591285 nova_compute[182755]: 2026-01-22 00:15:14.117 182759 DEBUG oslo_concurrency.lockutils [req-6a60daf2-da13-40db-b399-76f929ea82e0 req-ce0403d6-db80-4936-9b70-f377b06ad544 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:14 np0005591285 nova_compute[182755]: 2026-01-22 00:15:14.118 182759 DEBUG nova.compute.manager [req-6a60daf2-da13-40db-b399-76f929ea82e0 req-ce0403d6-db80-4936-9b70-f377b06ad544 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] No waiting events found dispatching network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:15:14 np0005591285 nova_compute[182755]: 2026-01-22 00:15:14.119 182759 WARNING nova.compute.manager [req-6a60daf2-da13-40db-b399-76f929ea82e0 req-ce0403d6-db80-4936-9b70-f377b06ad544 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received unexpected event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 for instance with vm_state active and task_state None.
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.667 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.962 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.963 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.963 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.964 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.964 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.976 182759 INFO nova.compute.manager [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Terminating instance
Jan 21 19:15:15 np0005591285 nova_compute[182755]: 2026-01-22 00:15:15.985 182759 DEBUG nova.compute.manager [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 19:15:16 np0005591285 kernel: tap7c155749-14 (unregistering): left promiscuous mode
Jan 21 19:15:16 np0005591285 NetworkManager[55017]: <info>  [1769040916.0141] device (tap7c155749-14): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:15:16 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:16Z|00496|binding|INFO|Releasing lport 7c155749-1486-479f-9ee6-d99ea840c942 from this chassis (sb_readonly=0)
Jan 21 19:15:16 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:16Z|00497|binding|INFO|Setting lport 7c155749-1486-479f-9ee6-d99ea840c942 down in Southbound
Jan 21 19:15:16 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:16Z|00498|binding|INFO|Removing iface tap7c155749-14 ovn-installed in OVS
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.023 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.032 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:d6:55 10.100.0.5'], port_security=['fa:16:3e:36:d6:55 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '40f18c70-81c1-4729-929c-5368ae297eb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=7c155749-1486-479f-9ee6-d99ea840c942) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.033 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 7c155749-1486-479f-9ee6-d99ea840c942 in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 unbound from our chassis
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.035 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c19848fe-a435-4c66-8190-94e8e9e1b266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.036 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a199fa1-7873-4fed-8846-ceae0219e45a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.037 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace which is not needed anymore
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.039 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:16 np0005591285 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Jan 21 19:15:16 np0005591285 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007e.scope: Consumed 4.566s CPU time.
Jan 21 19:15:16 np0005591285 systemd-machined[154022]: Machine qemu-58-instance-0000007e terminated.
Jan 21 19:15:16 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [NOTICE]   (231817) : haproxy version is 2.8.14-c23fe91
Jan 21 19:15:16 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [NOTICE]   (231817) : path to executable is /usr/sbin/haproxy
Jan 21 19:15:16 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [WARNING]  (231817) : Exiting Master process...
Jan 21 19:15:16 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [ALERT]    (231817) : Current worker (231819) exited with code 143 (Terminated)
Jan 21 19:15:16 np0005591285 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[231813]: [WARNING]  (231817) : All workers exited. Exiting... (0)
Jan 21 19:15:16 np0005591285 systemd[1]: libpod-8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f.scope: Deactivated successfully.
Jan 21 19:15:16 np0005591285 podman[231853]: 2026-01-22 00:15:16.2124144 +0000 UTC m=+0.060981127 container died 8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:15:16 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f-userdata-shm.mount: Deactivated successfully.
Jan 21 19:15:16 np0005591285 systemd[1]: var-lib-containers-storage-overlay-1cf8d423dbf9343aaf173cbb9c6fc95091f5a2cdfa708608948902ea8e1553e4-merged.mount: Deactivated successfully.
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.254 182759 INFO nova.virt.libvirt.driver [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Instance destroyed successfully.#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.255 182759 DEBUG nova.objects.instance [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'resources' on Instance uuid 40f18c70-81c1-4729-929c-5368ae297eb8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:15:16 np0005591285 podman[231853]: 2026-01-22 00:15:16.258426107 +0000 UTC m=+0.106992774 container cleanup 8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:15:16 np0005591285 systemd[1]: libpod-conmon-8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f.scope: Deactivated successfully.
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.292 182759 DEBUG nova.virt.libvirt.vif [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1628567258',display_name='tempest-MultipleCreateTestJSON-server-1628567258-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1628567258-1',id=126,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:15:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-ivpvh0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:15:12Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=40f18c70-81c1-4729-929c-5368ae297eb8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.293 182759 DEBUG nova.network.os_vif_util [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "7c155749-1486-479f-9ee6-d99ea840c942", "address": "fa:16:3e:36:d6:55", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c155749-14", "ovs_interfaceid": "7c155749-1486-479f-9ee6-d99ea840c942", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.294 182759 DEBUG nova.network.os_vif_util [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.294 182759 DEBUG os_vif [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.296 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.296 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c155749-14, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.301 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.303 182759 INFO os_vif [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:d6:55,bridge_name='br-int',has_traffic_filtering=True,id=7c155749-1486-479f-9ee6-d99ea840c942,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c155749-14')#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.304 182759 INFO nova.virt.libvirt.driver [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Deleting instance files /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8_del#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.305 182759 INFO nova.virt.libvirt.driver [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Deletion of /var/lib/nova/instances/40f18c70-81c1-4729-929c-5368ae297eb8_del complete#033[00m
Jan 21 19:15:16 np0005591285 podman[231899]: 2026-01-22 00:15:16.320948354 +0000 UTC m=+0.041573470 container remove 8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.326 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb533af-3bd5-4579-8c19-5cc0754a5bf8]: (4, ('Thu Jan 22 12:15:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f)\n8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f\nThu Jan 22 12:15:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f)\n8a6685c8d83330df309f243eb6f3235a1cfe8c78077d0b6bc7aa57f7d75df81f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.328 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f75eea30-a56c-4029-9dfa-e48541408b04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.329 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.332 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:16 np0005591285 kernel: tapc19848fe-a0: left promiscuous mode
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.343 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.344 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.346 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2e33cb3f-1074-4333-8eca-a963dbdefbc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.369 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[97b77a00-31ac-4f2f-a0b7-04ec6d786dd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.371 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[565003f1-4887-48b1-821d-b1686fb9629e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.383 182759 INFO nova.compute.manager [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.383 182759 DEBUG oslo.service.loopingcall [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.383 182759 DEBUG nova.compute.manager [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:15:16 np0005591285 nova_compute[182755]: 2026-01-22 00:15:16.384 182759 DEBUG nova.network.neutron [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.393 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c48b7211-b893-442a-96af-4fe12e95c182]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547013, 'reachable_time': 33909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231914, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.396 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:15:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:16.396 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c85f13-66e1-43d4-94dd-a8b8cf3f7b55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:16 np0005591285 systemd[1]: run-netns-ovnmeta\x2dc19848fe\x2da435\x2d4c66\x2d8190\x2d94e8e9e1b266.mount: Deactivated successfully.
Jan 21 19:15:17 np0005591285 nova_compute[182755]: 2026-01-22 00:15:17.567 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:17 np0005591285 nova_compute[182755]: 2026-01-22 00:15:17.970 182759 DEBUG nova.network.neutron [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:15:17 np0005591285 nova_compute[182755]: 2026-01-22 00:15:17.993 182759 INFO nova.compute.manager [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Took 1.61 seconds to deallocate network for instance.#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.070 182759 DEBUG nova.compute.manager [req-a4d09075-3f92-43e9-9688-bcc481349eb6 req-5f068e64-5eac-4e0a-a359-4e8178f76843 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received event network-vif-deleted-7c155749-1486-479f-9ee6-d99ea840c942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.116 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.117 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.195 182759 DEBUG nova.compute.provider_tree [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.214 182759 DEBUG nova.scheduler.client.report [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.242 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.265 182759 INFO nova.scheduler.client.report [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Deleted allocations for instance 40f18c70-81c1-4729-929c-5368ae297eb8#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.375 182759 DEBUG nova.compute.manager [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received event network-vif-unplugged-7c155749-1486-479f-9ee6-d99ea840c942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.376 182759 DEBUG oslo_concurrency.lockutils [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.376 182759 DEBUG oslo_concurrency.lockutils [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.376 182759 DEBUG oslo_concurrency.lockutils [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.376 182759 DEBUG nova.compute.manager [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] No waiting events found dispatching network-vif-unplugged-7c155749-1486-479f-9ee6-d99ea840c942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.377 182759 WARNING nova.compute.manager [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received unexpected event network-vif-unplugged-7c155749-1486-479f-9ee6-d99ea840c942 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.377 182759 DEBUG nova.compute.manager [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.377 182759 DEBUG oslo_concurrency.lockutils [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.377 182759 DEBUG oslo_concurrency.lockutils [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.377 182759 DEBUG oslo_concurrency.lockutils [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.377 182759 DEBUG nova.compute.manager [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] No waiting events found dispatching network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.378 182759 WARNING nova.compute.manager [req-d2b33b82-a711-4c41-aff3-3e56039c9dc3 req-1ea485ff-66bb-4e80-bc76-e382c7c37302 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Received unexpected event network-vif-plugged-7c155749-1486-479f-9ee6-d99ea840c942 for instance with vm_state deleted and task_state None.
Jan 21 19:15:18 np0005591285 nova_compute[182755]: 2026-01-22 00:15:18.389 182759 DEBUG oslo_concurrency.lockutils [None req-b9bfce37-3ed9-4dfb-9b43-bd50803f204c 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "40f18c70-81c1-4729-929c-5368ae297eb8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.746 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.747 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.782 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.880 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.881 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.886 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 19:15:20 np0005591285 nova_compute[182755]: 2026-01-22 00:15:20.886 182759 INFO nova.compute.claims [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Claim successful on node compute-2.ctlplane.example.com
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.072 182759 DEBUG nova.compute.provider_tree [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.089 182759 DEBUG nova.scheduler.client.report [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.112 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.113 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.198 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.198 182759 DEBUG nova.network.neutron [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.226 182759 INFO nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.256 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.299 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.397 182759 DEBUG nova.policy [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '154c87efa1ae4839ac679b6bb5a57518', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7abcf6b34bcf45a9937f19251e144d1e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.420 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.422 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.422 182759 INFO nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Creating image(s)
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.423 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "/var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.423 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "/var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.424 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "/var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.440 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.534 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.535 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.535 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.546 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.618 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.620 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.658 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.659 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.660 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.740 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.740 182759 DEBUG nova.virt.disk.api [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Checking if we can resize image /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.741 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.799 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.800 182759 DEBUG nova.virt.disk.api [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Cannot resize image /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.800 182759 DEBUG nova.objects.instance [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lazy-loading 'migration_context' on Instance uuid d698021b-7b60-4b57-bb35-826d365e5bd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.813 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.814 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Ensure instance console log exists: /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.814 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.814 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.814 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.982 182759 DEBUG nova.network.neutron [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Successfully created port: 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 19:15:21 np0005591285 nova_compute[182755]: 2026-01-22 00:15:21.985 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:22.507 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:15:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:22.508 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 19:15:22 np0005591285 nova_compute[182755]: 2026-01-22 00:15:22.539 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:22 np0005591285 nova_compute[182755]: 2026-01-22 00:15:22.569 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.061 182759 DEBUG nova.network.neutron [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Successfully updated port: 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.150 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "refresh_cache-d698021b-7b60-4b57-bb35-826d365e5bd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.151 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquired lock "refresh_cache-d698021b-7b60-4b57-bb35-826d365e5bd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.151 182759 DEBUG nova.network.neutron [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.736 182759 DEBUG nova.compute.manager [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-changed-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.737 182759 DEBUG nova.compute.manager [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Refreshing instance network info cache due to event network-changed-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.738 182759 DEBUG oslo_concurrency.lockutils [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d698021b-7b60-4b57-bb35-826d365e5bd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:15:23 np0005591285 nova_compute[182755]: 2026-01-22 00:15:23.902 182759 DEBUG nova.network.neutron [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 19:15:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:24.511 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.947 182759 DEBUG nova.network.neutron [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Updating instance_info_cache with network_info: [{"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.974 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Releasing lock "refresh_cache-d698021b-7b60-4b57-bb35-826d365e5bd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.975 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Instance network_info: |[{"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.976 182759 DEBUG oslo_concurrency.lockutils [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d698021b-7b60-4b57-bb35-826d365e5bd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.977 182759 DEBUG nova.network.neutron [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Refreshing network info cache for port 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.983 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Start _get_guest_xml network_info=[{"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:15:24 np0005591285 nova_compute[182755]: 2026-01-22 00:15:24.991 182759 WARNING nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.001 182759 DEBUG nova.virt.libvirt.host [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.002 182759 DEBUG nova.virt.libvirt.host [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.012 182759 DEBUG nova.virt.libvirt.host [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.013 182759 DEBUG nova.virt.libvirt.host [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.015 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.015 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.016 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.016 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.016 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.016 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.017 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.017 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.017 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.018 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.018 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.018 182759 DEBUG nova.virt.hardware [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.023 182759 DEBUG nova.virt.libvirt.vif [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1898834577',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1898834577',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1898834577',id=129,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7abcf6b34bcf45a9937f19251e144d1e',ramdisk_id='',reservation_id='r-g2fz3s0v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-116
1793871',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1161793871-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:21Z,user_data=None,user_id='154c87efa1ae4839ac679b6bb5a57518',uuid=d698021b-7b60-4b57-bb35-826d365e5bd3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.023 182759 DEBUG nova.network.os_vif_util [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Converting VIF {"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.024 182759 DEBUG nova.network.os_vif_util [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.026 182759 DEBUG nova.objects.instance [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lazy-loading 'pci_devices' on Instance uuid d698021b-7b60-4b57-bb35-826d365e5bd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.058 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <uuid>d698021b-7b60-4b57-bb35-826d365e5bd3</uuid>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <name>instance-00000081</name>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-1898834577</nova:name>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:15:24</nova:creationTime>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:user uuid="154c87efa1ae4839ac679b6bb5a57518">tempest-ServersNegativeTestMultiTenantJSON-1161793871-project-member</nova:user>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:project uuid="7abcf6b34bcf45a9937f19251e144d1e">tempest-ServersNegativeTestMultiTenantJSON-1161793871</nova:project>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        <nova:port uuid="7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <entry name="serial">d698021b-7b60-4b57-bb35-826d365e5bd3</entry>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <entry name="uuid">d698021b-7b60-4b57-bb35-826d365e5bd3</entry>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.config"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:d3:2e:0d"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <target dev="tap7d76cfd6-c5"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/console.log" append="off"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:15:25 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:15:25 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:15:25 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:15:25 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.060 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Preparing to wait for external event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.061 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.061 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.062 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.063 182759 DEBUG nova.virt.libvirt.vif [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1898834577',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1898834577',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1898834577',id=129,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7abcf6b34bcf45a9937f19251e144d1e',ramdisk_id='',reservation_id='r-g2fz3s0v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1161793871',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1161793871-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:21Z,user_data=None,user_id='154c87efa1ae4839ac679b6bb5a57518',uuid=d698021b-7b60-4b57-bb35-826d365e5bd3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.063 182759 DEBUG nova.network.os_vif_util [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Converting VIF {"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.064 182759 DEBUG nova.network.os_vif_util [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.065 182759 DEBUG os_vif [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.066 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.067 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.067 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.072 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d76cfd6-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.072 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7d76cfd6-c5, col_values=(('external_ids', {'iface-id': '7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:2e:0d', 'vm-uuid': 'd698021b-7b60-4b57-bb35-826d365e5bd3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.114 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 NetworkManager[55017]: <info>  [1769040925.1160] manager: (tap7d76cfd6-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.117 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.122 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.123 182759 INFO os_vif [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5')#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.198 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.199 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.199 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] No VIF found with MAC fa:16:3e:d3:2e:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.200 182759 INFO nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Using config drive#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.661 182759 INFO nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Creating config drive at /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.config#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.668 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqr_qrbcd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.792 182759 DEBUG oslo_concurrency.processutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqr_qrbcd" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:25 np0005591285 kernel: tap7d76cfd6-c5: entered promiscuous mode
Jan 21 19:15:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:25Z|00499|binding|INFO|Claiming lport 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 for this chassis.
Jan 21 19:15:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:25Z|00500|binding|INFO|7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873: Claiming fa:16:3e:d3:2e:0d 10.100.0.8
Jan 21 19:15:25 np0005591285 NetworkManager[55017]: <info>  [1769040925.8670] manager: (tap7d76cfd6-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.876 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 systemd-udevd[231985]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:15:25 np0005591285 systemd-machined[154022]: New machine qemu-59-instance-00000081.
Jan 21 19:15:25 np0005591285 podman[231937]: 2026-01-22 00:15:25.895582726 +0000 UTC m=+0.065401335 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:15:25 np0005591285 NetworkManager[55017]: <info>  [1769040925.9030] device (tap7d76cfd6-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:15:25 np0005591285 NetworkManager[55017]: <info>  [1769040925.9041] device (tap7d76cfd6-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:15:25 np0005591285 systemd[1]: Started Virtual Machine qemu-59-instance-00000081.
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.921 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:25Z|00501|binding|INFO|Setting lport 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 ovn-installed in OVS
Jan 21 19:15:25 np0005591285 nova_compute[182755]: 2026-01-22 00:15:25.926 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:25Z|00502|binding|INFO|Setting lport 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 up in Southbound
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.935 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:2e:0d 10.100.0.8'], port_security=['fa:16:3e:d3:2e:0d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd698021b-7b60-4b57-bb35-826d365e5bd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b22364e6-24ba-4549-9455-32f1aced62b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7abcf6b34bcf45a9937f19251e144d1e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4a65fa38-0599-423c-a1cc-2beb11e474bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=995fd5d6-737d-49bb-b4b2-b9f6f6d20c8d, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.936 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 in datapath b22364e6-24ba-4549-9455-32f1aced62b6 bound to our chassis#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.937 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b22364e6-24ba-4549-9455-32f1aced62b6#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.948 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[53b90c81-e7de-4f69-98d5-6d1354f1edd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.949 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb22364e6-21 in ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.951 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb22364e6-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.951 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4b8800-f610-49cc-be1e-3b42c49f5834]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.952 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a4031c-2a2f-4f8a-9029-d9b0350c41e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.962 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7df0be-489b-46dc-a050-d634fe7d10de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:25 np0005591285 podman[231941]: 2026-01-22 00:15:25.963816695 +0000 UTC m=+0.129279087 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:15:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:25.985 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6a898da2-bfa8-42af-9bf7-5d973243ae25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.023 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cd70aaeb-9987-42e0-b52d-3389549c1a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.030 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[18bfe969-a148-4232-92ce-baca590d9860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 NetworkManager[55017]: <info>  [1769040926.0319] manager: (tapb22364e6-20): new Veth device (/org/freedesktop/NetworkManager/Devices/242)
Jan 21 19:15:26 np0005591285 systemd-udevd[231988]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.066 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e28fa032-e574-4539-b69e-5416ef46c898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.069 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f3aca5-1b32-4a76-9e1e-dbf36882c060]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 NetworkManager[55017]: <info>  [1769040926.0889] device (tapb22364e6-20): carrier: link connected
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.093 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[00b37ed0-b247-441e-b178-80e45ec9d3f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.108 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[641ee502-fc9c-413d-a1cb-dbdcd7eb930e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb22364e6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:0d:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548474, 'reachable_time': 29352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232022, 'error': None, 'target': 'ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.122 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3885221f-0b16-4af3-aaf9-3eeaa300baad]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4f:dba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548474, 'tstamp': 548474}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232023, 'error': None, 'target': 'ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.142 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e77388a-0ad4-4a88-9f2e-2513427c1a9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb22364e6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:0d:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 156], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548474, 'reachable_time': 29352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232024, 'error': None, 'target': 'ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.173 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[57d3b62f-d9f2-45d7-bb13-c029f36d98c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.245 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1d6aa5-8c22-45b1-82bb-9a7ca56d5149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.246 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb22364e6-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.247 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.247 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb22364e6-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.300 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:26 np0005591285 kernel: tapb22364e6-20: entered promiscuous mode
Jan 21 19:15:26 np0005591285 NetworkManager[55017]: <info>  [1769040926.3025] manager: (tapb22364e6-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.306 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb22364e6-20, col_values=(('external_ids', {'iface-id': '8bd09136-46a7-43b7-b761-32b8518381b2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:26Z|00503|binding|INFO|Releasing lport 8bd09136-46a7-43b7-b761-32b8518381b2 from this chassis (sb_readonly=0)
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.308 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.313 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040926.3127508, d698021b-7b60-4b57-bb35-826d365e5bd3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.313 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] VM Started (Lifecycle Event)#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.336 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.337 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b22364e6-24ba-4549-9455-32f1aced62b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b22364e6-24ba-4549-9455-32f1aced62b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.338 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[191aa917-48dd-4e97-b12e-b39ab3052a22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.339 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-b22364e6-24ba-4549-9455-32f1aced62b6
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/b22364e6-24ba-4549-9455-32f1aced62b6.pid.haproxy
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID b22364e6-24ba-4549-9455-32f1aced62b6
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:15:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:26.340 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6', 'env', 'PROCESS_TAG=haproxy-b22364e6-24ba-4549-9455-32f1aced62b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b22364e6-24ba-4549-9455-32f1aced62b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.358 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.363 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040926.3129163, d698021b-7b60-4b57-bb35-826d365e5bd3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.363 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.387 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.390 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:15:26 np0005591285 nova_compute[182755]: 2026-01-22 00:15:26.411 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:15:26 np0005591285 podman[232063]: 2026-01-22 00:15:26.703736372 +0000 UTC m=+0.052608513 container create ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 19:15:26 np0005591285 systemd[1]: Started libpod-conmon-ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774.scope.
Jan 21 19:15:26 np0005591285 podman[232063]: 2026-01-22 00:15:26.672049738 +0000 UTC m=+0.020921919 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:15:26 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:15:26 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51f3b1f23be5f93a79d4828ee3f76b5b013e20290563caceaec3bc435ff7f225/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:15:26 np0005591285 podman[232063]: 2026-01-22 00:15:26.798291714 +0000 UTC m=+0.147163875 container init ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:15:26 np0005591285 podman[232063]: 2026-01-22 00:15:26.803210455 +0000 UTC m=+0.152082596 container start ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:15:26 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [NOTICE]   (232083) : New worker (232085) forked
Jan 21 19:15:26 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [NOTICE]   (232083) : Loading success.
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.379 182759 DEBUG nova.compute.manager [req-959a545f-c13d-4c54-a7bb-8f7b8593c216 req-2d566604-6e15-4e8b-bfa5-a96cf4df3d77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.380 182759 DEBUG oslo_concurrency.lockutils [req-959a545f-c13d-4c54-a7bb-8f7b8593c216 req-2d566604-6e15-4e8b-bfa5-a96cf4df3d77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.381 182759 DEBUG oslo_concurrency.lockutils [req-959a545f-c13d-4c54-a7bb-8f7b8593c216 req-2d566604-6e15-4e8b-bfa5-a96cf4df3d77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.381 182759 DEBUG oslo_concurrency.lockutils [req-959a545f-c13d-4c54-a7bb-8f7b8593c216 req-2d566604-6e15-4e8b-bfa5-a96cf4df3d77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.382 182759 DEBUG nova.compute.manager [req-959a545f-c13d-4c54-a7bb-8f7b8593c216 req-2d566604-6e15-4e8b-bfa5-a96cf4df3d77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Processing event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.383 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.390 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040927.3899307, d698021b-7b60-4b57-bb35-826d365e5bd3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.391 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.396 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.402 182759 INFO nova.virt.libvirt.driver [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Instance spawned successfully.#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.403 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.430 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.437 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.437 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.438 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.439 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.440 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.441 182759 DEBUG nova.virt.libvirt.driver [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.450 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.493 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.530 182759 INFO nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Took 6.11 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.531 182759 DEBUG nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.612 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.666 182759 INFO nova.compute.manager [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Took 6.82 seconds to build instance.#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.694 182759 DEBUG oslo_concurrency.lockutils [None req-b0c4f951-d686-472a-b22b-f4ccb1bf59da 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.754 182759 DEBUG nova.network.neutron [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Updated VIF entry in instance network info cache for port 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.756 182759 DEBUG nova.network.neutron [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Updating instance_info_cache with network_info: [{"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:15:27 np0005591285 nova_compute[182755]: 2026-01-22 00:15:27.905 182759 DEBUG oslo_concurrency.lockutils [req-f87a35ad-1483-4d8f-aa2c-5f08f66464be req-3fb76e56-0c40-4628-8a4f-29ec4a56b6bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d698021b-7b60-4b57-bb35-826d365e5bd3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.230 182759 DEBUG nova.compute.manager [req-9b9fe77f-16c6-4c19-aaf6-d24760b47552 req-acb54359-283b-45a6-a422-a0071e50f170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.231 182759 DEBUG oslo_concurrency.lockutils [req-9b9fe77f-16c6-4c19-aaf6-d24760b47552 req-acb54359-283b-45a6-a422-a0071e50f170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.231 182759 DEBUG oslo_concurrency.lockutils [req-9b9fe77f-16c6-4c19-aaf6-d24760b47552 req-acb54359-283b-45a6-a422-a0071e50f170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.231 182759 DEBUG oslo_concurrency.lockutils [req-9b9fe77f-16c6-4c19-aaf6-d24760b47552 req-acb54359-283b-45a6-a422-a0071e50f170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.232 182759 DEBUG nova.compute.manager [req-9b9fe77f-16c6-4c19-aaf6-d24760b47552 req-acb54359-283b-45a6-a422-a0071e50f170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] No waiting events found dispatching network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:15:30 np0005591285 nova_compute[182755]: 2026-01-22 00:15:30.232 182759 WARNING nova.compute.manager [req-9b9fe77f-16c6-4c19-aaf6-d24760b47552 req-acb54359-283b-45a6-a422-a0071e50f170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received unexpected event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:15:31 np0005591285 nova_compute[182755]: 2026-01-22 00:15:31.253 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040916.252138, 40f18c70-81c1-4729-929c-5368ae297eb8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:15:31 np0005591285 nova_compute[182755]: 2026-01-22 00:15:31.254 182759 INFO nova.compute.manager [-] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:15:31 np0005591285 nova_compute[182755]: 2026-01-22 00:15:31.280 182759 DEBUG nova.compute.manager [None req-6328ae2c-dcbf-4747-9fce-61491d710853 - - - - - -] [instance: 40f18c70-81c1-4729-929c-5368ae297eb8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.189 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.189 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.190 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.190 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.190 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.205 182759 INFO nova.compute.manager [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Terminating instance#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.217 182759 DEBUG nova.compute.manager [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:15:32 np0005591285 kernel: tap7d76cfd6-c5 (unregistering): left promiscuous mode
Jan 21 19:15:32 np0005591285 NetworkManager[55017]: <info>  [1769040932.2377] device (tap7d76cfd6-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:15:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:32Z|00504|binding|INFO|Releasing lport 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 from this chassis (sb_readonly=0)
Jan 21 19:15:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:32Z|00505|binding|INFO|Setting lport 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 down in Southbound
Jan 21 19:15:32 np0005591285 ovn_controller[94908]: 2026-01-22T00:15:32Z|00506|binding|INFO|Removing iface tap7d76cfd6-c5 ovn-installed in OVS
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.246 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.252 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:2e:0d 10.100.0.8'], port_security=['fa:16:3e:d3:2e:0d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd698021b-7b60-4b57-bb35-826d365e5bd3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b22364e6-24ba-4549-9455-32f1aced62b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7abcf6b34bcf45a9937f19251e144d1e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4a65fa38-0599-423c-a1cc-2beb11e474bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=995fd5d6-737d-49bb-b4b2-b9f6f6d20c8d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.254 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 in datapath b22364e6-24ba-4549-9455-32f1aced62b6 unbound from our chassis#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.255 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b22364e6-24ba-4549-9455-32f1aced62b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.256 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c7040f3b-bdfe-40c8-b951-48492141a0a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.257 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6 namespace which is not needed anymore#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.262 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000081.scope: Deactivated successfully.
Jan 21 19:15:32 np0005591285 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000081.scope: Consumed 5.331s CPU time.
Jan 21 19:15:32 np0005591285 systemd-machined[154022]: Machine qemu-59-instance-00000081 terminated.
Jan 21 19:15:32 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [NOTICE]   (232083) : haproxy version is 2.8.14-c23fe91
Jan 21 19:15:32 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [NOTICE]   (232083) : path to executable is /usr/sbin/haproxy
Jan 21 19:15:32 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [WARNING]  (232083) : Exiting Master process...
Jan 21 19:15:32 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [ALERT]    (232083) : Current worker (232085) exited with code 143 (Terminated)
Jan 21 19:15:32 np0005591285 neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6[232079]: [WARNING]  (232083) : All workers exited. Exiting... (0)
Jan 21 19:15:32 np0005591285 systemd[1]: libpod-ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774.scope: Deactivated successfully.
Jan 21 19:15:32 np0005591285 podman[232118]: 2026-01-22 00:15:32.37440981 +0000 UTC m=+0.042297729 container died ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:15:32 np0005591285 systemd[1]: var-lib-containers-storage-overlay-51f3b1f23be5f93a79d4828ee3f76b5b013e20290563caceaec3bc435ff7f225-merged.mount: Deactivated successfully.
Jan 21 19:15:32 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774-userdata-shm.mount: Deactivated successfully.
Jan 21 19:15:32 np0005591285 podman[232118]: 2026-01-22 00:15:32.405179221 +0000 UTC m=+0.073067140 container cleanup ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:15:32 np0005591285 systemd[1]: libpod-conmon-ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774.scope: Deactivated successfully.
Jan 21 19:15:32 np0005591285 podman[232147]: 2026-01-22 00:15:32.46180055 +0000 UTC m=+0.037555393 container remove ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.469 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0de34eea-f280-402c-b55d-969d77c5d7c0]: (4, ('Thu Jan 22 12:15:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6 (ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774)\nca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774\nThu Jan 22 12:15:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6 (ca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774)\nca2078fb798e2cf86af5c24f8826cde2a2e9b105d27a6858f30a0d3d4611e774\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.471 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[45877caa-94c8-44a2-b1c6-c9eb81a0dc75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.473 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb22364e6-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.475 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 kernel: tapb22364e6-20: left promiscuous mode
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.480 182759 INFO nova.virt.libvirt.driver [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Instance destroyed successfully.#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.480 182759 DEBUG nova.objects.instance [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lazy-loading 'resources' on Instance uuid d698021b-7b60-4b57-bb35-826d365e5bd3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.491 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.495 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba845b60-8a72-4d06-83e6-8dd5b837869e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.500 182759 DEBUG nova.virt.libvirt.vif [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:15:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-1898834577',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-1898834577',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-1898834577',id=129,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:15:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7abcf6b34bcf45a9937f19251e144d1e',ramdisk_id='',reservation_id='r-g2fz3s0v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1161793871',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1161793871-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:15:27Z,user_data=None,user_id='154c87efa1ae4839ac679b6bb5a57518',uuid=d698021b-7b60-4b57-bb35-826d365e5bd3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.501 182759 DEBUG nova.network.os_vif_util [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Converting VIF {"id": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "address": "fa:16:3e:d3:2e:0d", "network": {"id": "b22364e6-24ba-4549-9455-32f1aced62b6", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1033463498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7abcf6b34bcf45a9937f19251e144d1e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d76cfd6-c5", "ovs_interfaceid": "7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.502 182759 DEBUG nova.network.os_vif_util [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.503 182759 DEBUG os_vif [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.505 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.505 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d76cfd6-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.508 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.507 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6dddd2cf-83cd-4df8-a1b9-c85cc4b9f120]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.508 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0e25db58-7a2c-4b42-8124-ded727d89132]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.511 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.514 182759 INFO os_vif [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:2e:0d,bridge_name='br-int',has_traffic_filtering=True,id=7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873,network=Network(b22364e6-24ba-4549-9455-32f1aced62b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d76cfd6-c5')#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.515 182759 INFO nova.virt.libvirt.driver [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Deleting instance files /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3_del#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.516 182759 INFO nova.virt.libvirt.driver [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Deletion of /var/lib/nova/instances/d698021b-7b60-4b57-bb35-826d365e5bd3_del complete#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.527 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d7f20a-ab05-4e19-abbd-509fcfff1907]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548467, 'reachable_time': 15205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232184, 'error': None, 'target': 'ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.530 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b22364e6-24ba-4549-9455-32f1aced62b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:15:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:15:32.530 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[0d55a076-9ab7-47bc-a44d-8ca19716cd47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:15:32 np0005591285 systemd[1]: run-netns-ovnmeta\x2db22364e6\x2d24ba\x2d4549\x2d9455\x2d32f1aced62b6.mount: Deactivated successfully.
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.587 182759 INFO nova.compute.manager [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.588 182759 DEBUG oslo.service.loopingcall [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.588 182759 DEBUG nova.compute.manager [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.588 182759 DEBUG nova.network.neutron [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.972 182759 DEBUG nova.compute.manager [req-50d9ed16-7375-44f6-89aa-068bad5fd858 req-345e720e-3af1-4a55-ae36-06a8d56fa124 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-vif-unplugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.973 182759 DEBUG oslo_concurrency.lockutils [req-50d9ed16-7375-44f6-89aa-068bad5fd858 req-345e720e-3af1-4a55-ae36-06a8d56fa124 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.973 182759 DEBUG oslo_concurrency.lockutils [req-50d9ed16-7375-44f6-89aa-068bad5fd858 req-345e720e-3af1-4a55-ae36-06a8d56fa124 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.973 182759 DEBUG oslo_concurrency.lockutils [req-50d9ed16-7375-44f6-89aa-068bad5fd858 req-345e720e-3af1-4a55-ae36-06a8d56fa124 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.973 182759 DEBUG nova.compute.manager [req-50d9ed16-7375-44f6-89aa-068bad5fd858 req-345e720e-3af1-4a55-ae36-06a8d56fa124 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] No waiting events found dispatching network-vif-unplugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:15:32 np0005591285 nova_compute[182755]: 2026-01-22 00:15:32.973 182759 DEBUG nova.compute.manager [req-50d9ed16-7375-44f6-89aa-068bad5fd858 req-345e720e-3af1-4a55-ae36-06a8d56fa124 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-vif-unplugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.115 182759 DEBUG nova.network.neutron [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.145 182759 INFO nova.compute.manager [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Took 1.56 seconds to deallocate network for instance.#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.225 182759 DEBUG nova.compute.manager [req-b3d9226e-119b-4282-b347-1982a7dba441 req-dff4666c-6147-4793-b41f-178b3747d30c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-vif-deleted-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.273 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.273 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.354 182759 DEBUG nova.compute.provider_tree [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.380 182759 DEBUG nova.scheduler.client.report [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.414 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.453 182759 INFO nova.scheduler.client.report [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Deleted allocations for instance d698021b-7b60-4b57-bb35-826d365e5bd3#033[00m
Jan 21 19:15:34 np0005591285 nova_compute[182755]: 2026-01-22 00:15:34.673 182759 DEBUG oslo_concurrency.lockutils [None req-f45d7d13-375c-4ef3-b082-fddd10e99c0e 154c87efa1ae4839ac679b6bb5a57518 7abcf6b34bcf45a9937f19251e144d1e - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:35 np0005591285 nova_compute[182755]: 2026-01-22 00:15:35.164 182759 DEBUG nova.compute.manager [req-a1b24989-6709-4017-b91e-03200a6d98a6 req-96f1b3f2-652a-4bac-bd0d-223b12228c6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:35 np0005591285 nova_compute[182755]: 2026-01-22 00:15:35.165 182759 DEBUG oslo_concurrency.lockutils [req-a1b24989-6709-4017-b91e-03200a6d98a6 req-96f1b3f2-652a-4bac-bd0d-223b12228c6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:35 np0005591285 nova_compute[182755]: 2026-01-22 00:15:35.165 182759 DEBUG oslo_concurrency.lockutils [req-a1b24989-6709-4017-b91e-03200a6d98a6 req-96f1b3f2-652a-4bac-bd0d-223b12228c6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:35 np0005591285 nova_compute[182755]: 2026-01-22 00:15:35.165 182759 DEBUG oslo_concurrency.lockutils [req-a1b24989-6709-4017-b91e-03200a6d98a6 req-96f1b3f2-652a-4bac-bd0d-223b12228c6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d698021b-7b60-4b57-bb35-826d365e5bd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:35 np0005591285 nova_compute[182755]: 2026-01-22 00:15:35.166 182759 DEBUG nova.compute.manager [req-a1b24989-6709-4017-b91e-03200a6d98a6 req-96f1b3f2-652a-4bac-bd0d-223b12228c6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] No waiting events found dispatching network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:15:35 np0005591285 nova_compute[182755]: 2026-01-22 00:15:35.166 182759 WARNING nova.compute.manager [req-a1b24989-6709-4017-b91e-03200a6d98a6 req-96f1b3f2-652a-4bac-bd0d-223b12228c6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Received unexpected event network-vif-plugged-7d76cfd6-c5ce-4d30-bb5f-6fe27cb61873 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:15:36 np0005591285 podman[232185]: 2026-01-22 00:15:36.213560197 +0000 UTC m=+0.063511255 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:15:37 np0005591285 nova_compute[182755]: 2026-01-22 00:15:37.508 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:37 np0005591285 nova_compute[182755]: 2026-01-22 00:15:37.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:39 np0005591285 nova_compute[182755]: 2026-01-22 00:15:39.339 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:40 np0005591285 podman[232212]: 2026-01-22 00:15:40.209171164 +0000 UTC m=+0.073522201 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:15:40 np0005591285 podman[232213]: 2026-01-22 00:15:40.230927424 +0000 UTC m=+0.094840679 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:15:42 np0005591285 podman[232254]: 2026-01-22 00:15:42.308022853 +0000 UTC m=+0.157513121 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 21 19:15:42 np0005591285 nova_compute[182755]: 2026-01-22 00:15:42.510 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:42 np0005591285 nova_compute[182755]: 2026-01-22 00:15:42.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.549 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.549 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.588 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.766 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.767 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.775 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:15:43 np0005591285 nova_compute[182755]: 2026-01-22 00:15:43.775 182759 INFO nova.compute.claims [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.144 182759 DEBUG nova.compute.provider_tree [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.182 182759 DEBUG nova.scheduler.client.report [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.224 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.226 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.345 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.346 182759 DEBUG nova.network.neutron [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.380 182759 INFO nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.411 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.637 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.638 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.639 182759 INFO nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Creating image(s)#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.639 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "/var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.640 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "/var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.641 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "/var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.657 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.751 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.752 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.753 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.764 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.833 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.835 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.875 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.878 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.879 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.979 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.980 182759 DEBUG nova.virt.disk.api [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Checking if we can resize image /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:15:44 np0005591285 nova_compute[182755]: 2026-01-22 00:15:44.981 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.038 182759 DEBUG nova.policy [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.042 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.043 182759 DEBUG nova.virt.disk.api [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Cannot resize image /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.043 182759 DEBUG nova.objects.instance [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'migration_context' on Instance uuid be07a8c0-548d-4bfc-b004-799a831f8252 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.091 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.091 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Ensure instance console log exists: /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.092 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.092 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:45 np0005591285 nova_compute[182755]: 2026-01-22 00:15:45.093 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:47 np0005591285 nova_compute[182755]: 2026-01-22 00:15:47.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:47 np0005591285 nova_compute[182755]: 2026-01-22 00:15:47.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:47 np0005591285 nova_compute[182755]: 2026-01-22 00:15:47.481 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040932.4796743, d698021b-7b60-4b57-bb35-826d365e5bd3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:15:47 np0005591285 nova_compute[182755]: 2026-01-22 00:15:47.481 182759 INFO nova.compute.manager [-] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:15:47 np0005591285 nova_compute[182755]: 2026-01-22 00:15:47.515 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:47 np0005591285 nova_compute[182755]: 2026-01-22 00:15:47.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:48 np0005591285 nova_compute[182755]: 2026-01-22 00:15:48.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:48 np0005591285 nova_compute[182755]: 2026-01-22 00:15:48.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:15:48 np0005591285 nova_compute[182755]: 2026-01-22 00:15:48.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:15:48 np0005591285 nova_compute[182755]: 2026-01-22 00:15:48.635 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:15:48 np0005591285 nova_compute[182755]: 2026-01-22 00:15:48.635 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:15:48 np0005591285 nova_compute[182755]: 2026-01-22 00:15:48.647 182759 DEBUG nova.compute.manager [None req-9bbc4779-80c2-477c-b571-d05f24b3ad99 - - - - - -] [instance: d698021b-7b60-4b57-bb35-826d365e5bd3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:15:50 np0005591285 nova_compute[182755]: 2026-01-22 00:15:50.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:50 np0005591285 nova_compute[182755]: 2026-01-22 00:15:50.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:50 np0005591285 nova_compute[182755]: 2026-01-22 00:15:50.629 182759 DEBUG nova.network.neutron [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Successfully created port: 9cb3a782-7c84-4107-9add-6097c8ec4c06 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.352 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.353 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.353 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.353 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.997 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.998 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.1931037902832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.998 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:15:52 np0005591285 nova_compute[182755]: 2026-01-22 00:15:52.999 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.582 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance be07a8c0-548d-4bfc-b004-799a831f8252 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.582 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.582 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.683 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.701 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.758 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:15:53 np0005591285 nova_compute[182755]: 2026-01-22 00:15:53.758 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.760s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.154 182759 DEBUG nova.network.neutron [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Successfully updated port: 9cb3a782-7c84-4107-9add-6097c8ec4c06 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.172 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.173 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquired lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.173 182759 DEBUG nova.network.neutron [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.753 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.962 182759 DEBUG nova.compute.manager [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-changed-9cb3a782-7c84-4107-9add-6097c8ec4c06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.963 182759 DEBUG nova.compute.manager [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Refreshing instance network info cache due to event network-changed-9cb3a782-7c84-4107-9add-6097c8ec4c06. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.963 182759 DEBUG oslo_concurrency.lockutils [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:15:54 np0005591285 nova_compute[182755]: 2026-01-22 00:15:54.991 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:15:55 np0005591285 nova_compute[182755]: 2026-01-22 00:15:55.558 182759 DEBUG nova.network.neutron [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:15:56 np0005591285 podman[232297]: 2026-01-22 00:15:56.198431098 +0000 UTC m=+0.056269821 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 19:15:56 np0005591285 podman[232298]: 2026-01-22 00:15:56.218340879 +0000 UTC m=+0.070723347 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:15:57 np0005591285 nova_compute[182755]: 2026-01-22 00:15:57.520 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:57 np0005591285 nova_compute[182755]: 2026-01-22 00:15:57.664 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:15:59 np0005591285 nova_compute[182755]: 2026-01-22 00:15:59.091 182759 DEBUG nova.network.neutron [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Updating instance_info_cache with network_info: [{"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.151 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Releasing lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.151 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Instance network_info: |[{"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.152 182759 DEBUG oslo_concurrency.lockutils [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.152 182759 DEBUG nova.network.neutron [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Refreshing network info cache for port 9cb3a782-7c84-4107-9add-6097c8ec4c06 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.158 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Start _get_guest_xml network_info=[{"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.163 182759 WARNING nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.169 182759 DEBUG nova.virt.libvirt.host [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.170 182759 DEBUG nova.virt.libvirt.host [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.173 182759 DEBUG nova.virt.libvirt.host [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.173 182759 DEBUG nova.virt.libvirt.host [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.175 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.175 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.176 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.176 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.176 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.177 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.177 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.177 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.178 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.178 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.178 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.179 182759 DEBUG nova.virt.hardware [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.183 182759 DEBUG nova.virt.libvirt.vif [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2105253339',display_name='tempest-ServerRescueNegativeTestJSON-server-2105253339',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-2105253339',id=132,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-aq0efre3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_name='tempest-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:44Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=be07a8c0-548d-4bfc-b004-799a831f8252,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.184 182759 DEBUG nova.network.os_vif_util [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.185 182759 DEBUG nova.network.os_vif_util [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.186 182759 DEBUG nova.objects.instance [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'pci_devices' on Instance uuid be07a8c0-548d-4bfc-b004-799a831f8252 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.354 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <uuid>be07a8c0-548d-4bfc-b004-799a831f8252</uuid>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <name>instance-00000084</name>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-2105253339</nova:name>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:16:01</nova:creationTime>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:user uuid="c26ff016fcfc4e08803feb0e96005a8e">tempest-ServerRescueNegativeTestJSON-1986679883-project-member</nova:user>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:project uuid="4c6e66779ffe440d9c3270f0328391fb">tempest-ServerRescueNegativeTestJSON-1986679883</nova:project>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        <nova:port uuid="9cb3a782-7c84-4107-9add-6097c8ec4c06">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <entry name="serial">be07a8c0-548d-4bfc-b004-799a831f8252</entry>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <entry name="uuid">be07a8c0-548d-4bfc-b004-799a831f8252</entry>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.config"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:d8:7c:18"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <target dev="tap9cb3a782-7c"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/console.log" append="off"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:16:01 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:16:01 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:16:01 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:16:01 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.356 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Preparing to wait for external event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.356 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.357 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.357 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.357 182759 DEBUG nova.virt.libvirt.vif [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:15:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2105253339',display_name='tempest-ServerRescueNegativeTestJSON-server-2105253339',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-2105253339',id=132,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-aq0efre3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_name='tempest-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:44Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=be07a8c0-548d-4bfc-b004-799a831f8252,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.358 182759 DEBUG nova.network.os_vif_util [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.358 182759 DEBUG nova.network.os_vif_util [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.359 182759 DEBUG os_vif [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.359 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.360 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.360 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.363 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.363 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cb3a782-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.363 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9cb3a782-7c, col_values=(('external_ids', {'iface-id': '9cb3a782-7c84-4107-9add-6097c8ec4c06', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:7c:18', 'vm-uuid': 'be07a8c0-548d-4bfc-b004-799a831f8252'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.364 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:01 np0005591285 NetworkManager[55017]: <info>  [1769040961.3660] manager: (tap9cb3a782-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.367 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.370 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.371 182759 INFO os_vif [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c')#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.488 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.489 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.489 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] No VIF found with MAC fa:16:3e:d8:7c:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:16:01 np0005591285 nova_compute[182755]: 2026-01-22 00:16:01.489 182759 INFO nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Using config drive#033[00m
Jan 21 19:16:02 np0005591285 nova_compute[182755]: 2026-01-22 00:16:02.666 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:02.979 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:16:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:02.980 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:16:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:02.980 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:16:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:03.977 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:16:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:03.977 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:16:03 np0005591285 nova_compute[182755]: 2026-01-22 00:16:03.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:04 np0005591285 nova_compute[182755]: 2026-01-22 00:16:04.941 182759 INFO nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Creating config drive at /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.config#033[00m
Jan 21 19:16:04 np0005591285 nova_compute[182755]: 2026-01-22 00:16:04.947 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqoc98rwd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.084 182759 DEBUG oslo_concurrency.processutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqoc98rwd" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:16:05 np0005591285 kernel: tap9cb3a782-7c: entered promiscuous mode
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.141 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:05Z|00507|binding|INFO|Claiming lport 9cb3a782-7c84-4107-9add-6097c8ec4c06 for this chassis.
Jan 21 19:16:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:05Z|00508|binding|INFO|9cb3a782-7c84-4107-9add-6097c8ec4c06: Claiming fa:16:3e:d8:7c:18 10.100.0.3
Jan 21 19:16:05 np0005591285 NetworkManager[55017]: <info>  [1769040965.1433] manager: (tap9cb3a782-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.145 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.164 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:7c:18 10.100.0.3'], port_security=['fa:16:3e:d8:7c:18 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55594f65-206f-4b2a-a4ed-c049861ef480', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e66779ffe440d9c3270f0328391fb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99a05b18-dc2e-46bb-b7ee-4bfce96057f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f515f91a-3ddc-47bf-8aaf-753c19e78de1, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=9cb3a782-7c84-4107-9add-6097c8ec4c06) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.165 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 9cb3a782-7c84-4107-9add-6097c8ec4c06 in datapath 55594f65-206f-4b2a-a4ed-c049861ef480 bound to our chassis#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.166 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 55594f65-206f-4b2a-a4ed-c049861ef480#033[00m
Jan 21 19:16:05 np0005591285 systemd-udevd[232355]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.183 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6c90bf43-324f-4225-bfb3-fe1d4cbc9c23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.185 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap55594f65-21 in ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:16:05 np0005591285 NetworkManager[55017]: <info>  [1769040965.1869] device (tap9cb3a782-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.188 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap55594f65-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.188 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7a230f-9374-4e8e-bb6b-72460fcfe1a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 NetworkManager[55017]: <info>  [1769040965.1888] device (tap9cb3a782-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.189 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e86d80-b3e5-4ef2-b70e-deda91672fbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.203 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[b84d08d5-f747-4b53-bcf6-05272f137bb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 systemd-machined[154022]: New machine qemu-60-instance-00000084.
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.222 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:05Z|00509|binding|INFO|Setting lport 9cb3a782-7c84-4107-9add-6097c8ec4c06 ovn-installed in OVS
Jan 21 19:16:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:05Z|00510|binding|INFO|Setting lport 9cb3a782-7c84-4107-9add-6097c8ec4c06 up in Southbound
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.227 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 systemd[1]: Started Virtual Machine qemu-60-instance-00000084.
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.229 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[18ef128f-7e54-4ea4-b62e-3e0826e456ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.273 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[03e39c39-9bb6-47fb-9c2d-11217b3936c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 NetworkManager[55017]: <info>  [1769040965.2802] manager: (tap55594f65-20): new Veth device (/org/freedesktop/NetworkManager/Devices/246)
Jan 21 19:16:05 np0005591285 systemd-udevd[232360]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.279 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4cc9ee-4435-4bb6-b216-d878e69cf945]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.320 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cbfb5f9c-db1b-427a-b493-62b50da88db7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.324 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[50a35b60-2795-4592-9d83-62946a905d32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 NetworkManager[55017]: <info>  [1769040965.3570] device (tap55594f65-20): carrier: link connected
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.363 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc4d2af-1063-480e-89f1-07b096464a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.380 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb321a8-65f4-4f50-a713-369ce2bb9ca6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55594f65-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:fe:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552401, 'reachable_time': 27202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232391, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.395 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e265a509-da56-4794-8924-475e1f982a78]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:fea1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552401, 'tstamp': 552401}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232392, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.422 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb5d21d-7658-4756-aefc-287f6cb98200]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap55594f65-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:fe:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552401, 'reachable_time': 27202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232397, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.464 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[092c21c3-810b-49b3-bd98-d32dd4c2cefa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.486 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040965.4856842, be07a8c0-548d-4bfc-b004-799a831f8252 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.486 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] VM Started (Lifecycle Event)#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.513 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.519 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040965.4858391, be07a8c0-548d-4bfc-b004-799a831f8252 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.519 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.543 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c844847e-37f3-42f6-9e17-56f6ffa3be22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.544 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55594f65-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.544 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.544 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55594f65-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:05 np0005591285 NetworkManager[55017]: <info>  [1769040965.5469] manager: (tap55594f65-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 kernel: tap55594f65-20: entered promiscuous mode
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.549 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap55594f65-20, col_values=(('external_ids', {'iface-id': 'c516d686-0754-486d-a980-7442f4c88088'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:05Z|00511|binding|INFO|Releasing lport c516d686-0754-486d-a980-7442f4c88088 from this chassis (sb_readonly=0)
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.552 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.553 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e510fbbb-0d10-41dd-8452-0673becf2766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.554 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-55594f65-206f-4b2a-a4ed-c049861ef480
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/55594f65-206f-4b2a-a4ed-c049861ef480.pid.haproxy
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 55594f65-206f-4b2a-a4ed-c049861ef480
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:16:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:05.556 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'env', 'PROCESS_TAG=haproxy-55594f65-206f-4b2a-a4ed-c049861ef480', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/55594f65-206f-4b2a-a4ed-c049861ef480.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.579 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.580 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.584 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:16:05 np0005591285 nova_compute[182755]: 2026-01-22 00:16:05.615 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:16:05 np0005591285 podman[232430]: 2026-01-22 00:16:05.960732813 +0000 UTC m=+0.056047306 container create 48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:16:06 np0005591285 systemd[1]: Started libpod-conmon-48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85.scope.
Jan 21 19:16:06 np0005591285 podman[232430]: 2026-01-22 00:16:05.933976629 +0000 UTC m=+0.029291112 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:16:06 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:16:06 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aed3c941d635211ff622eeb2b1fe5e53480103e095539e0c3b8b5a680cad59b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:16:06 np0005591285 podman[232430]: 2026-01-22 00:16:06.067659863 +0000 UTC m=+0.162974396 container init 48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 19:16:06 np0005591285 podman[232430]: 2026-01-22 00:16:06.079612002 +0000 UTC m=+0.174926455 container start 48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 19:16:06 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [NOTICE]   (232450) : New worker (232452) forked
Jan 21 19:16:06 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [NOTICE]   (232450) : Loading success.
Jan 21 19:16:06 np0005591285 nova_compute[182755]: 2026-01-22 00:16:06.365 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:07 np0005591285 podman[232461]: 2026-01-22 00:16:07.217827358 +0000 UTC m=+0.080139947 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:16:07 np0005591285 nova_compute[182755]: 2026-01-22 00:16:07.668 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:08 np0005591285 nova_compute[182755]: 2026-01-22 00:16:08.567 182759 DEBUG nova.network.neutron [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Updated VIF entry in instance network info cache for port 9cb3a782-7c84-4107-9add-6097c8ec4c06. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:16:08 np0005591285 nova_compute[182755]: 2026-01-22 00:16:08.568 182759 DEBUG nova.network.neutron [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Updating instance_info_cache with network_info: [{"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.431 182759 DEBUG nova.compute.manager [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.431 182759 DEBUG oslo_concurrency.lockutils [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.432 182759 DEBUG oslo_concurrency.lockutils [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.432 182759 DEBUG oslo_concurrency.lockutils [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.432 182759 DEBUG nova.compute.manager [req-491c35bd-b51a-437b-9dcc-e338dfc1828e req-e6671563-3f77-4486-befc-f54df90a7dd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Processing event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.432 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.436 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769040969.4364352, be07a8c0-548d-4bfc-b004-799a831f8252 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.436 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.438 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.441 182759 INFO nova.virt.libvirt.driver [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Instance spawned successfully.#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.441 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.582 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.586 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.651 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.666 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.666 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.667 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.667 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.667 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.668 182759 DEBUG nova.virt.libvirt.driver [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.672 182759 DEBUG oslo_concurrency.lockutils [req-f9414583-18ed-4de1-9ad4-9113d27e1fdb req-9d96da57-0dd1-481c-8c73-b8b01fc61b47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.942 182759 INFO nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Took 25.30 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:16:09 np0005591285 nova_compute[182755]: 2026-01-22 00:16:09.943 182759 DEBUG nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:10 np0005591285 nova_compute[182755]: 2026-01-22 00:16:10.107 182759 INFO nova.compute.manager [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Took 26.40 seconds to build instance.#033[00m
Jan 21 19:16:10 np0005591285 nova_compute[182755]: 2026-01-22 00:16:10.209 182759 DEBUG oslo_concurrency.lockutils [None req-c523f52c-14e1-48fe-abe8-06282d7fbefc c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:16:11 np0005591285 podman[232485]: 2026-01-22 00:16:11.187117153 +0000 UTC m=+0.061528751 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:16:11 np0005591285 podman[232486]: 2026-01-22 00:16:11.22521529 +0000 UTC m=+0.094281365 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.671 182759 DEBUG nova.compute.manager [req-df2027ae-a8be-4c9b-9bc5-649b8547e607 req-02fc496f-264a-44ea-af56-01f6621e11e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.671 182759 DEBUG oslo_concurrency.lockutils [req-df2027ae-a8be-4c9b-9bc5-649b8547e607 req-02fc496f-264a-44ea-af56-01f6621e11e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.671 182759 DEBUG oslo_concurrency.lockutils [req-df2027ae-a8be-4c9b-9bc5-649b8547e607 req-02fc496f-264a-44ea-af56-01f6621e11e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.672 182759 DEBUG oslo_concurrency.lockutils [req-df2027ae-a8be-4c9b-9bc5-649b8547e607 req-02fc496f-264a-44ea-af56-01f6621e11e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.672 182759 DEBUG nova.compute.manager [req-df2027ae-a8be-4c9b-9bc5-649b8547e607 req-02fc496f-264a-44ea-af56-01f6621e11e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] No waiting events found dispatching network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:16:11 np0005591285 nova_compute[182755]: 2026-01-22 00:16:11.672 182759 WARNING nova.compute.manager [req-df2027ae-a8be-4c9b-9bc5-649b8547e607 req-02fc496f-264a-44ea-af56-01f6621e11e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received unexpected event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 for instance with vm_state active and task_state None.
Jan 21 19:16:12 np0005591285 nova_compute[182755]: 2026-01-22 00:16:12.670 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:16:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:12.981 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:16:13 np0005591285 podman[232525]: 2026-01-22 00:16:13.247786304 +0000 UTC m=+0.122060466 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 19:16:16 np0005591285 nova_compute[182755]: 2026-01-22 00:16:16.423 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:16:17 np0005591285 nova_compute[182755]: 2026-01-22 00:16:17.672 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:16:21 np0005591285 nova_compute[182755]: 2026-01-22 00:16:21.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:16:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:22Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:7c:18 10.100.0.3
Jan 21 19:16:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:16:22Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:7c:18 10.100.0.3
Jan 21 19:16:22 np0005591285 nova_compute[182755]: 2026-01-22 00:16:22.675 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.170 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000084', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '4c6e66779ffe440d9c3270f0328391fb', 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'hostId': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.171 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.187 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c152d37b-58d4-4598-9cc0-ca2c30598707', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'timestamp': '2026-01-22T00:16:23.171690', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '95771c20-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.90635976, 'message_signature': 'a3ac1edae47a33df51f1a845837bc934314c6c1389696c2dacf49fe1dc546188'}]}, 'timestamp': '2026-01-22 00:16:23.188638', '_unique_id': 'e4db5d2999f747809e13314423510f2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.191 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.194 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for be07a8c0-548d-4bfc-b004-799a831f8252 / tap9cb3a782-7c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.194 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdcf2723-a4c8-41ff-86e4-b142554c3a93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.192675', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '957830e2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '627ce25962a79dde340e346bdc17ac8682eed0a2a2b8a17eadada978c6fa3175'}]}, 'timestamp': '2026-01-22 00:16:23.195407', '_unique_id': '942f56d6d6cd442aaa817de5c5b7b6e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.196 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0715764b-0e42-41a3-9757-eaea0234aa43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.197179', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '95788286-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '6ad38253b6aab9b9bcf6c7a4aedede6bc0d1550326874af32ce8f031baf9b3fe'}]}, 'timestamp': '2026-01-22 00:16:23.197468', '_unique_id': 'f2a33810e58c4e23be041d7c053f9b6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.197 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.199 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.199 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>]
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.199 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.199 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>]
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.224 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.write.requests volume: 294 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.224 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae1a97f4-1ce3-41f4-81d6-302be3445d81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 294, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.200218', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '957cafaa-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'ff5adcde52ea84b58268963a61f4e286ceacc5ad7c8c55dafa24cfe4c7486574'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.200218', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '957cbe28-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'e507c5c74bdc85b81190618a05df73b36787a088d8222d25d67475fa8af32d4e'}]}, 'timestamp': '2026-01-22 00:16:23.225212', '_unique_id': '86b82b2792664341a67ceca726ef1305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.226 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.227 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.227 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>]
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.227 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb2f27fe-3e08-471c-98e1-4ef7e7ab8b7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.227836', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '957d31fa-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': 'f0efb1c7713a685936358022553a307037cea6afb2646bfac1300d745db36426'}]}, 'timestamp': '2026-01-22 00:16:23.228239', '_unique_id': '8c40b6845e794daca3e0d09b0286a45d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.228 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.241 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.241 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7785ceec-d5e7-4708-b23f-a068c89ba274', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.230045', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '957f3fcc-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.949255294, 'message_signature': '6482681037055c009632606967757261e5ef1379bf36b1d49d630e27d5a4ca3e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.230045', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '957f4bb6-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.949255294, 'message_signature': 'fab68848ff3578bee56d15509d85d49f5acea13e623b40e57420324303413dee'}]}, 'timestamp': '2026-01-22 00:16:23.241956', '_unique_id': '62b4d84b464d41379c69a38813df9a1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.242 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.244 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.244 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de5a60ec-c024-46ce-a1f6-8f8fa3281355', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.244000', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '957fa8c2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'c22e9e3a31b68b0862d753689b2baf57f0608ece0757aeb41a45499139f4d86c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.244000', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '957fb696-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'bf44fa318fde340654c04b56d9b2ffbb1bf431888375a1fddac4dabaac1aa445'}]}, 'timestamp': '2026-01-22 00:16:23.244695', '_unique_id': '40077cf1cb2f46f280755ba727a6d5ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.245 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.246 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a84f9af-3cd7-4230-bc78-73cac7427452', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.246444', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '958007cc-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': 'd804e3a11fb87390eb29746d15b579a3289b6f86bc3fe1d8c4400255088f48aa'}]}, 'timestamp': '2026-01-22 00:16:23.246753', '_unique_id': '019214fedd564208a5c297e640227584'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.247 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.248 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.read.latency volume: 179259932 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.248 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.read.latency volume: 25468959 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17b31777-f1ac-4bcb-b47d-58981e5c8264', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 179259932, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.248404', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95805326-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': '461a51e6e04f28f64644b6be8282201568ddf02ce641aa1fed80d67039b29e45'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25468959, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.248404', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95805eac-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'a428d201b3078a0b942564b10c4a3f1f7b5949c54f5775ebbd68124a2762a5bb'}]}, 'timestamp': '2026-01-22 00:16:23.249015', '_unique_id': '265220cf881c4b3e9878e9714b3802a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.249 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.250 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08a6bb0c-b70d-4217-80e3-36f26a93a1fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.250587', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '9580a8e4-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '93f0e3ff01d983d0a8c62484ffccb1db1f7d1700846a301d2b0ac65bf238caca'}]}, 'timestamp': '2026-01-22 00:16:23.250918', '_unique_id': 'cf457feebe894df6883088009c5ad462'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.252 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.252 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a7c2ba5-cb36-4967-a586-fb8ac2c4eb69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.252460', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9580f286-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.949255294, 'message_signature': '698cc05e3e8a6b3983a7b68eb59316d05e089b376ff89d8f0b2358a11394a3b3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.252460', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9580ffe2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.949255294, 'message_signature': '40e4bfa4604a9f8859988626f4b30df32a3d5ce22f4a8ee6f95aaec20b83224f'}]}, 'timestamp': '2026-01-22 00:16:23.253131', '_unique_id': 'bd702b663c6a4574862eb97437e78428'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.254 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.255 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '216ffc9e-12df-4f11-87c1-2e18149f8f1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'timestamp': '2026-01-22T00:16:23.255044', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '958158b6-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.90635976, 'message_signature': '76cf8fa79a505da019851e2d6c9f4fd7749e56997ccbf2ed0e6e0555feccbf4b'}]}, 'timestamp': '2026-01-22 00:16:23.255466', '_unique_id': '287a1c25f894479583cf6fef3fe2878f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.256 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.257 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c9cc7bd-8d84-4904-9b58-938b35cc8704', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.257371', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '9581b36a-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '771e6fd562671fc7242d0f32e692d53f3247f03e70c84e70d9dbcf4adbd01e4e'}]}, 'timestamp': '2026-01-22 00:16:23.257707', '_unique_id': '0385bdfd8e4946dbadf98619499bd09e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.259 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.259 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.259 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec25a416-7787-4e77-923e-adbc01476027', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.259397', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '958201d0-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.949255294, 'message_signature': '66fb634a2fdeb3aba7c6c6cfd6aca6a6faa716c42403f90d6107c8959d99f485'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.259397', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95820bf8-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.949255294, 'message_signature': '6efa13b9b7fce1c82d3e2950670519b8deb9cf459146e8637bd714393a30b83d'}]}, 'timestamp': '2026-01-22 00:16:23.259980', '_unique_id': '6741867b42c34366b1ca2dedef2ef7c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.261 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '862302b8-3360-4bfb-8877-b93f37632675', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.261539', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '958254dc-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '1179682e72addac5c5c9656a38b3c24480cd83827949f20fef78649c2c697be0'}]}, 'timestamp': '2026-01-22 00:16:23.261829', '_unique_id': '3652a7b427d54fe296043406377ccd04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.263 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e73ff249-4318-4c01-8948-9f1c5e330c7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.263302', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '958298f2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '0e48701c863d742e83490c58e6ee38a5a683541f318142b99cfc393672fc78c9'}]}, 'timestamp': '2026-01-22 00:16:23.263570', '_unique_id': '81c237b81b2b43c08698e53574d0b1ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.265 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.read.requests volume: 1057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.266 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce5d63e9-b106-49f9-88ee-27af82f3cd21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1057, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.265656', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9582fd60-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': '7af8cfaf5c31c0bdb956e4970fa322e3aba003f7b12bc08cf4c9f2cafd145722'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.265656', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95830e5e-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': '97b816772517055f068df702b5a6d861f492b9b768d93f3e2a258dc967107750'}]}, 'timestamp': '2026-01-22 00:16:23.266626', '_unique_id': '8959961a582045888bb25d8589dcbdda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.269 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.269 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b6ee980-588b-4939-a7ed-dddce935b891', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.269417', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '95838ae6-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': 'be81f5fb87ef4e12db9793287de6ca322a87bea51376df9995edd2f6e9362667'}]}, 'timestamp': '2026-01-22 00:16:23.269818', '_unique_id': '90eeaba7ff8f4c04a29a5271a6bfcaa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.271 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.271 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.read.bytes volume: 29305344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.272 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ac24f0a-ff05-43b1-8b3c-478217ea3296', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29305344, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.271916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9583ede2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': '7e380fdf6fa30f2e4cadd907ff16c1f330551290d386e099651a0978f439d46f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.271916', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9583fcc4-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': '98182334e8b252901257b119ffb455082e7a2cc97e10de107add7f6e66034019'}]}, 'timestamp': '2026-01-22 00:16:23.272717', '_unique_id': '326ef71892b2497fa8ea32ba87d14416'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.273 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.274 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.274 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.write.latency volume: 2934580559 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.275 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6284dd0-0d0a-4edf-94b3-b86e4b234b61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2934580559, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-vda', 'timestamp': '2026-01-22T00:16:23.274956', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '95846362-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'ec634137997926c2e1012291e9769188d69f450478ac86f1b8b4d83bceb41138'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'be07a8c0-548d-4bfc-b004-799a831f8252-sda', 'timestamp': '2026-01-22T00:16:23.274956', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'instance-00000084', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '95847118-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.919408559, 'message_signature': 'd1a0b9c7290a3087bac23a4491e4810a5f6a22ee4ec965dadec6ecae7da3a7b9'}]}, 'timestamp': '2026-01-22 00:16:23.275693', '_unique_id': '859cf4a06b334c6289a05d0bbf17febb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.278 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.278 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.278 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-2105253339>]
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.279 12 DEBUG ceilometer.compute.pollsters [-] be07a8c0-548d-4bfc-b004-799a831f8252/network.outgoing.bytes volume: 1152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c5daba8-cefd-4f35-9b88-2643107f1b83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1152, 'user_id': 'c26ff016fcfc4e08803feb0e96005a8e', 'user_name': None, 'project_id': '4c6e66779ffe440d9c3270f0328391fb', 'project_name': None, 'resource_id': 'instance-00000084-be07a8c0-548d-4bfc-b004-799a831f8252-tap9cb3a782-7c', 'timestamp': '2026-01-22T00:16:23.279223', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-2105253339', 'name': 'tap9cb3a782-7c', 'instance_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'instance_type': 'm1.nano', 'host': 'f137520901d0b37968133fe2c1e8620421bce1d99f8452f026773e2d', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:7c:18', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9cb3a782-7c'}, 'message_id': '95850c4a-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5541.911869597, 'message_signature': '0e3ebddd4552247f795b5cd1b4e0b3b5dd0eed4c07d99aebb4b4556a72ab9a3c'}]}, 'timestamp': '2026-01-22 00:16:23.279735', '_unique_id': 'd69c2a39763b4dc0b7bc4636fde083a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:16:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:16:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:16:23 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 19:16:26 np0005591285 nova_compute[182755]: 2026-01-22 00:16:26.428 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:27 np0005591285 podman[232566]: 2026-01-22 00:16:27.183363823 +0000 UTC m=+0.058591504 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Jan 21 19:16:27 np0005591285 podman[232567]: 2026-01-22 00:16:27.224001846 +0000 UTC m=+0.083983271 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 19:16:27 np0005591285 nova_compute[182755]: 2026-01-22 00:16:27.678 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:31 np0005591285 nova_compute[182755]: 2026-01-22 00:16:31.459 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:32 np0005591285 nova_compute[182755]: 2026-01-22 00:16:32.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:36 np0005591285 nova_compute[182755]: 2026-01-22 00:16:36.462 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:37 np0005591285 nova_compute[182755]: 2026-01-22 00:16:37.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:38 np0005591285 podman[232608]: 2026-01-22 00:16:38.184855047 +0000 UTC m=+0.054630528 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:16:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:41.387 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:16:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:41.389 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:16:41 np0005591285 nova_compute[182755]: 2026-01-22 00:16:41.390 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:41 np0005591285 nova_compute[182755]: 2026-01-22 00:16:41.464 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:42 np0005591285 podman[232632]: 2026-01-22 00:16:42.198986559 +0000 UTC m=+0.065313443 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:16:42 np0005591285 podman[232633]: 2026-01-22 00:16:42.227420837 +0000 UTC m=+0.091229394 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:16:42 np0005591285 nova_compute[182755]: 2026-01-22 00:16:42.723 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:44 np0005591285 podman[232675]: 2026-01-22 00:16:44.233946833 +0000 UTC m=+0.100725307 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 21 19:16:44 np0005591285 nova_compute[182755]: 2026-01-22 00:16:44.451 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:45 np0005591285 nova_compute[182755]: 2026-01-22 00:16:45.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:45 np0005591285 nova_compute[182755]: 2026-01-22 00:16:45.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.097 182759 INFO nova.compute.manager [None req-f6fa4aaf-d9d0-499c-afb4-f0085cc491c3 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Pausing#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.099 182759 DEBUG nova.objects.instance [None req-f6fa4aaf-d9d0-499c-afb4-f0085cc491c3 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'flavor' on Instance uuid be07a8c0-548d-4bfc-b004-799a831f8252 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.202 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041006.2018137, be07a8c0-548d-4bfc-b004-799a831f8252 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.202 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.204 182759 DEBUG nova.compute.manager [None req-f6fa4aaf-d9d0-499c-afb4-f0085cc491c3 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.224 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.228 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.324 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Jan 21 19:16:46 np0005591285 nova_compute[182755]: 2026-01-22 00:16:46.467 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:47 np0005591285 nova_compute[182755]: 2026-01-22 00:16:47.725 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:48 np0005591285 nova_compute[182755]: 2026-01-22 00:16:48.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:49 np0005591285 nova_compute[182755]: 2026-01-22 00:16:49.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:49 np0005591285 nova_compute[182755]: 2026-01-22 00:16:49.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:16:49 np0005591285 nova_compute[182755]: 2026-01-22 00:16:49.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:16:50 np0005591285 nova_compute[182755]: 2026-01-22 00:16:50.998 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:16:50 np0005591285 nova_compute[182755]: 2026-01-22 00:16:50.998 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:16:50 np0005591285 nova_compute[182755]: 2026-01-22 00:16:50.998 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:16:50 np0005591285 nova_compute[182755]: 2026-01-22 00:16:50.999 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid be07a8c0-548d-4bfc-b004-799a831f8252 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:16:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:16:51.393 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:16:51 np0005591285 nova_compute[182755]: 2026-01-22 00:16:51.469 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.094 182759 INFO nova.compute.manager [None req-f7156d5a-5aed-484c-8e25-b7d26cfa3154 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Unpausing#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.095 182759 DEBUG nova.objects.instance [None req-f7156d5a-5aed-484c-8e25-b7d26cfa3154 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'flavor' on Instance uuid be07a8c0-548d-4bfc-b004-799a831f8252 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.429 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041012.4296489, be07a8c0-548d-4bfc-b004-799a831f8252 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.430 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:16:52 np0005591285 virtqemud[182299]: argument unsupported: QEMU guest agent is not configured
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.436 182759 DEBUG nova.virt.libvirt.guest [None req-f7156d5a-5aed-484c-8e25-b7d26cfa3154 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.437 182759 DEBUG nova.compute.manager [None req-f7156d5a-5aed-484c-8e25-b7d26cfa3154 c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.547 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.551 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.664 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Jan 21 19:16:52 np0005591285 nova_compute[182755]: 2026-01-22 00:16:52.727 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:53 np0005591285 nova_compute[182755]: 2026-01-22 00:16:53.802 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Updating instance_info_cache with network_info: [{"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.143 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-be07a8c0-548d-4bfc-b004-799a831f8252" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.144 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.145 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.145 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.145 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.145 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.214 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.214 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.215 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.215 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.352 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.426 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.427 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.488 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.660 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.662 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5565MB free_disk=73.16455459594727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.662 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:16:54 np0005591285 nova_compute[182755]: 2026-01-22 00:16:54.663 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.317 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance be07a8c0-548d-4bfc-b004-799a831f8252 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.317 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.318 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.434 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.545 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.591 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:16:55 np0005591285 nova_compute[182755]: 2026-01-22 00:16:55.593 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:16:56 np0005591285 nova_compute[182755]: 2026-01-22 00:16:56.471 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:56 np0005591285 nova_compute[182755]: 2026-01-22 00:16:56.667 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:16:57 np0005591285 nova_compute[182755]: 2026-01-22 00:16:57.729 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:16:58 np0005591285 podman[232709]: 2026-01-22 00:16:58.188951469 +0000 UTC m=+0.062775014 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41)
Jan 21 19:16:58 np0005591285 podman[232710]: 2026-01-22 00:16:58.222116894 +0000 UTC m=+0.080933189 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:17:01 np0005591285 nova_compute[182755]: 2026-01-22 00:17:01.475 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:02 np0005591285 nova_compute[182755]: 2026-01-22 00:17:02.731 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:02.980 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:02.981 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:02.982 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.070 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.070 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.071 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.071 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.071 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.094 182759 INFO nova.compute.manager [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Terminating instance#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.122 182759 DEBUG nova.compute.manager [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:17:04 np0005591285 kernel: tap9cb3a782-7c (unregistering): left promiscuous mode
Jan 21 19:17:04 np0005591285 NetworkManager[55017]: <info>  [1769041024.1507] device (tap9cb3a782-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:17:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:04Z|00512|binding|INFO|Releasing lport 9cb3a782-7c84-4107-9add-6097c8ec4c06 from this chassis (sb_readonly=0)
Jan 21 19:17:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:04Z|00513|binding|INFO|Setting lport 9cb3a782-7c84-4107-9add-6097c8ec4c06 down in Southbound
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.162 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:04Z|00514|binding|INFO|Removing iface tap9cb3a782-7c ovn-installed in OVS
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.177 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.178 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:7c:18 10.100.0.3'], port_security=['fa:16:3e:d8:7c:18 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'be07a8c0-548d-4bfc-b004-799a831f8252', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-55594f65-206f-4b2a-a4ed-c049861ef480', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4c6e66779ffe440d9c3270f0328391fb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99a05b18-dc2e-46bb-b7ee-4bfce96057f6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f515f91a-3ddc-47bf-8aaf-753c19e78de1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=9cb3a782-7c84-4107-9add-6097c8ec4c06) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.179 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 9cb3a782-7c84-4107-9add-6097c8ec4c06 in datapath 55594f65-206f-4b2a-a4ed-c049861ef480 unbound from our chassis#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.180 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 55594f65-206f-4b2a-a4ed-c049861ef480, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.181 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d07a02-63fd-43cb-90b1-1f813ff28cf6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.182 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 namespace which is not needed anymore#033[00m
Jan 21 19:17:04 np0005591285 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000084.scope: Deactivated successfully.
Jan 21 19:17:04 np0005591285 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000084.scope: Consumed 14.451s CPU time.
Jan 21 19:17:04 np0005591285 systemd-machined[154022]: Machine qemu-60-instance-00000084 terminated.
Jan 21 19:17:04 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [NOTICE]   (232450) : haproxy version is 2.8.14-c23fe91
Jan 21 19:17:04 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [NOTICE]   (232450) : path to executable is /usr/sbin/haproxy
Jan 21 19:17:04 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [WARNING]  (232450) : Exiting Master process...
Jan 21 19:17:04 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [WARNING]  (232450) : Exiting Master process...
Jan 21 19:17:04 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [ALERT]    (232450) : Current worker (232452) exited with code 143 (Terminated)
Jan 21 19:17:04 np0005591285 neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480[232446]: [WARNING]  (232450) : All workers exited. Exiting... (0)
Jan 21 19:17:04 np0005591285 systemd[1]: libpod-48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85.scope: Deactivated successfully.
Jan 21 19:17:04 np0005591285 podman[232774]: 2026-01-22 00:17:04.327159004 +0000 UTC m=+0.047814686 container died 48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:17:04 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85-userdata-shm.mount: Deactivated successfully.
Jan 21 19:17:04 np0005591285 systemd[1]: var-lib-containers-storage-overlay-aed3c941d635211ff622eeb2b1fe5e53480103e095539e0c3b8b5a680cad59b2-merged.mount: Deactivated successfully.
Jan 21 19:17:04 np0005591285 podman[232774]: 2026-01-22 00:17:04.367943491 +0000 UTC m=+0.088599193 container cleanup 48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:17:04 np0005591285 systemd[1]: libpod-conmon-48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85.scope: Deactivated successfully.
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.399 182759 INFO nova.virt.libvirt.driver [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Instance destroyed successfully.#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.400 182759 DEBUG nova.objects.instance [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lazy-loading 'resources' on Instance uuid be07a8c0-548d-4bfc-b004-799a831f8252 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.421 182759 DEBUG nova.virt.libvirt.vif [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:15:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2105253339',display_name='tempest-ServerRescueNegativeTestJSON-server-2105253339',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-2105253339',id=132,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:16:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4c6e66779ffe440d9c3270f0328391fb',ramdisk_id='',reservation_id='r-aq0efre3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1986679883',owner_user_name='tempest-ServerRescueNegativeTestJSON-1986679883-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:16:52Z,user_data=None,user_id='c26ff016fcfc4e08803feb0e96005a8e',uuid=be07a8c0-548d-4bfc-b004-799a831f8252,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.421 182759 DEBUG nova.network.os_vif_util [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converting VIF {"id": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "address": "fa:16:3e:d8:7c:18", "network": {"id": "55594f65-206f-4b2a-a4ed-c049861ef480", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2114387764-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4c6e66779ffe440d9c3270f0328391fb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9cb3a782-7c", "ovs_interfaceid": "9cb3a782-7c84-4107-9add-6097c8ec4c06", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.422 182759 DEBUG nova.network.os_vif_util [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.422 182759 DEBUG os_vif [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.424 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cb3a782-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.427 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.430 182759 INFO os_vif [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:7c:18,bridge_name='br-int',has_traffic_filtering=True,id=9cb3a782-7c84-4107-9add-6097c8ec4c06,network=Network(55594f65-206f-4b2a-a4ed-c049861ef480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9cb3a782-7c')#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.430 182759 INFO nova.virt.libvirt.driver [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Deleting instance files /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252_del#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.431 182759 INFO nova.virt.libvirt.driver [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Deletion of /var/lib/nova/instances/be07a8c0-548d-4bfc-b004-799a831f8252_del complete#033[00m
Jan 21 19:17:04 np0005591285 podman[232814]: 2026-01-22 00:17:04.447360879 +0000 UTC m=+0.051638669 container remove 48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.452 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc474a1-fc58-469f-88d7-bca57644daca]: (4, ('Thu Jan 22 12:17:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 (48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85)\n48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85\nThu Jan 22 12:17:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 (48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85)\n48d1b8ba862549ad395951fbe37a8efcef155a3e0d3f8de1a1756e4438934d85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.454 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f9305cee-505c-4642-84ed-82f6e2ba9181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.456 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55594f65-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.458 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 kernel: tap55594f65-20: left promiscuous mode
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.473 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.476 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[af826f10-d100-46d0-a607-5f0d562b7fc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.491 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2476988a-5a81-4320-8708-6c92c13a1d08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.492 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[222d7ee9-6159-4ea4-930d-6a77e924a51a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.510 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[789742f0-cce4-48a7-bcde-21be8b2e828f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552392, 'reachable_time': 34479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232832, 'error': None, 'target': 'ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.515 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-55594f65-206f-4b2a-a4ed-c049861ef480 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:17:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:04.515 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[706d8da4-04e7-40ec-bedf-e349ed9940ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:04 np0005591285 systemd[1]: run-netns-ovnmeta\x2d55594f65\x2d206f\x2d4b2a\x2da4ed\x2dc049861ef480.mount: Deactivated successfully.
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.523 182759 INFO nova.compute.manager [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.524 182759 DEBUG oslo.service.loopingcall [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.524 182759 DEBUG nova.compute.manager [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.525 182759 DEBUG nova.network.neutron [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.611 182759 DEBUG nova.compute.manager [req-89b28c02-11a1-4b20-b7a1-2a6127c99ffe req-237aa4d2-e097-43ac-bc51-d52f0d435bc7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-vif-unplugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.612 182759 DEBUG oslo_concurrency.lockutils [req-89b28c02-11a1-4b20-b7a1-2a6127c99ffe req-237aa4d2-e097-43ac-bc51-d52f0d435bc7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.612 182759 DEBUG oslo_concurrency.lockutils [req-89b28c02-11a1-4b20-b7a1-2a6127c99ffe req-237aa4d2-e097-43ac-bc51-d52f0d435bc7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.612 182759 DEBUG oslo_concurrency.lockutils [req-89b28c02-11a1-4b20-b7a1-2a6127c99ffe req-237aa4d2-e097-43ac-bc51-d52f0d435bc7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.612 182759 DEBUG nova.compute.manager [req-89b28c02-11a1-4b20-b7a1-2a6127c99ffe req-237aa4d2-e097-43ac-bc51-d52f0d435bc7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] No waiting events found dispatching network-vif-unplugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:04 np0005591285 nova_compute[182755]: 2026-01-22 00:17:04.612 182759 DEBUG nova.compute.manager [req-89b28c02-11a1-4b20-b7a1-2a6127c99ffe req-237aa4d2-e097-43ac-bc51-d52f0d435bc7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-vif-unplugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.075 182759 DEBUG nova.network.neutron [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.118 182759 INFO nova.compute.manager [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Took 1.59 seconds to deallocate network for instance.#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.313 182759 DEBUG nova.compute.manager [req-c3f0f63c-95cd-4270-852d-ffa12ac287cd req-1f248057-91d7-4b3b-bcf3-1edf04f5b747 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-vif-deleted-9cb3a782-7c84-4107-9add-6097c8ec4c06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.322 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.322 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.579 182759 DEBUG nova.compute.provider_tree [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.615 182759 DEBUG nova.scheduler.client.report [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.643 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.714 182759 INFO nova.scheduler.client.report [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Deleted allocations for instance be07a8c0-548d-4bfc-b004-799a831f8252#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.816 182759 DEBUG nova.compute.manager [req-54595edd-1d40-4c07-9d1a-15cc11bd7e71 req-db4636f7-76fb-4b5d-89ea-e8c827683ac4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.817 182759 DEBUG oslo_concurrency.lockutils [req-54595edd-1d40-4c07-9d1a-15cc11bd7e71 req-db4636f7-76fb-4b5d-89ea-e8c827683ac4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.818 182759 DEBUG oslo_concurrency.lockutils [req-54595edd-1d40-4c07-9d1a-15cc11bd7e71 req-db4636f7-76fb-4b5d-89ea-e8c827683ac4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.818 182759 DEBUG oslo_concurrency.lockutils [req-54595edd-1d40-4c07-9d1a-15cc11bd7e71 req-db4636f7-76fb-4b5d-89ea-e8c827683ac4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.819 182759 DEBUG nova.compute.manager [req-54595edd-1d40-4c07-9d1a-15cc11bd7e71 req-db4636f7-76fb-4b5d-89ea-e8c827683ac4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] No waiting events found dispatching network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.819 182759 WARNING nova.compute.manager [req-54595edd-1d40-4c07-9d1a-15cc11bd7e71 req-db4636f7-76fb-4b5d-89ea-e8c827683ac4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Received unexpected event network-vif-plugged-9cb3a782-7c84-4107-9add-6097c8ec4c06 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:17:06 np0005591285 nova_compute[182755]: 2026-01-22 00:17:06.829 182759 DEBUG oslo_concurrency.lockutils [None req-bc10fd85-ea1a-468e-ac18-7fe0b9c1a78d c26ff016fcfc4e08803feb0e96005a8e 4c6e66779ffe440d9c3270f0328391fb - - default default] Lock "be07a8c0-548d-4bfc-b004-799a831f8252" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:07 np0005591285 nova_compute[182755]: 2026-01-22 00:17:07.732 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:09 np0005591285 podman[232833]: 2026-01-22 00:17:09.188319878 +0000 UTC m=+0.058554952 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:17:09 np0005591285 nova_compute[182755]: 2026-01-22 00:17:09.427 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.219 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.220 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.221 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.221 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.222 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.222 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.251 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.274 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.274 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.275 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.275 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.275 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.276 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.276 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Removable base files: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.278 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.278 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.279 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.279 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.280 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.280 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.280 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.281 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 21 19:17:11 np0005591285 nova_compute[182755]: 2026-01-22 00:17:11.281 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Jan 21 19:17:12 np0005591285 nova_compute[182755]: 2026-01-22 00:17:12.734 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:12 np0005591285 nova_compute[182755]: 2026-01-22 00:17:12.967 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:13 np0005591285 podman[232858]: 2026-01-22 00:17:13.206253892 +0000 UTC m=+0.065993230 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:17:13 np0005591285 podman[232857]: 2026-01-22 00:17:13.238790619 +0000 UTC m=+0.101264140 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.643 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.643 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.662 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.840 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.841 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.848 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:17:14 np0005591285 nova_compute[182755]: 2026-01-22 00:17:14.848 182759 INFO nova.compute.claims [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:17:15 np0005591285 podman[232898]: 2026-01-22 00:17:15.256161605 +0000 UTC m=+0.129946536 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 21 19:17:15 np0005591285 nova_compute[182755]: 2026-01-22 00:17:15.697 182759 DEBUG nova.compute.provider_tree [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.022 182759 DEBUG nova.scheduler.client.report [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.167 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.168 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.242 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.242 182759 DEBUG nova.network.neutron [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.264 182759 INFO nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.309 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.467 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.469 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.469 182759 INFO nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Creating image(s)#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.470 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.470 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.471 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.484 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.579 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.580 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.580 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.591 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.667 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.668 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.700 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.702 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.702 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.778 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.779 182759 DEBUG nova.virt.disk.api [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.780 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.836 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.837 182759 DEBUG nova.virt.disk.api [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.838 182759 DEBUG nova.objects.instance [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 13e8c059-7a27-4db4-8151-32511c435aaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.855 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.855 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Ensure instance console log exists: /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.856 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.856 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:16 np0005591285 nova_compute[182755]: 2026-01-22 00:17:16.856 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:17 np0005591285 nova_compute[182755]: 2026-01-22 00:17:17.736 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:17 np0005591285 nova_compute[182755]: 2026-01-22 00:17:17.750 182759 DEBUG nova.policy [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.397 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041024.3962824, be07a8c0-548d-4bfc-b004-799a831f8252 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.398 182759 INFO nova.compute.manager [-] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.430 182759 DEBUG nova.compute.manager [None req-cbbfc65d-bb4c-410b-a158-3554962648f6 - - - - - -] [instance: be07a8c0-548d-4bfc-b004-799a831f8252] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.434 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.571 182759 DEBUG nova.network.neutron [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Successfully updated port: b5ddb845-36ad-438e-b95d-6d5c696671e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.620 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.620 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.620 182759 DEBUG nova.network.neutron [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.699 182759 DEBUG nova.compute.manager [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-changed-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.700 182759 DEBUG nova.compute.manager [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Refreshing instance network info cache due to event network-changed-b5ddb845-36ad-438e-b95d-6d5c696671e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:17:19 np0005591285 nova_compute[182755]: 2026-01-22 00:17:19.700 182759 DEBUG oslo_concurrency.lockutils [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:17:20 np0005591285 nova_compute[182755]: 2026-01-22 00:17:20.077 182759 DEBUG nova.network.neutron [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.233 182759 DEBUG nova.network.neutron [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Updating instance_info_cache with network_info: [{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.259 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.259 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Instance network_info: |[{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.260 182759 DEBUG oslo_concurrency.lockutils [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.260 182759 DEBUG nova.network.neutron [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Refreshing network info cache for port b5ddb845-36ad-438e-b95d-6d5c696671e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.262 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Start _get_guest_xml network_info=[{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.266 182759 WARNING nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.274 182759 DEBUG nova.virt.libvirt.host [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.275 182759 DEBUG nova.virt.libvirt.host [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.279 182759 DEBUG nova.virt.libvirt.host [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.280 182759 DEBUG nova.virt.libvirt.host [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.281 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.282 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.282 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.282 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.283 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.283 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.283 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.283 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.284 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.284 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.284 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.284 182759 DEBUG nova.virt.hardware [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.289 182759 DEBUG nova.virt.libvirt.vif [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-802139876',display_name='tempest-TestNetworkBasicOps-server-802139876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-802139876',id=135,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYawHES1Dl8cWfoRcVnJdkovN/OqIt5Nziohu0QwSEHRC3Kt+d0XSF/5jXquQtaLInb13URkzauTrULYPKVjHnM9UWAgs48JRCsN9Ey+2urYk0Y/V55QWRx25UlL+0jtQ==',key_name='tempest-TestNetworkBasicOps-1237503814',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-f06umpf3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:16Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=13e8c059-7a27-4db4-8151-32511c435aaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.290 182759 DEBUG nova.network.os_vif_util [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.291 182759 DEBUG nova.network.os_vif_util [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.292 182759 DEBUG nova.objects.instance [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 13e8c059-7a27-4db4-8151-32511c435aaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.313 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <uuid>13e8c059-7a27-4db4-8151-32511c435aaf</uuid>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <name>instance-00000087</name>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-802139876</nova:name>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:17:21</nova:creationTime>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        <nova:port uuid="b5ddb845-36ad-438e-b95d-6d5c696671e9">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <entry name="serial">13e8c059-7a27-4db4-8151-32511c435aaf</entry>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <entry name="uuid">13e8c059-7a27-4db4-8151-32511c435aaf</entry>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.config"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:30:f9:d2"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <target dev="tapb5ddb845-36"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/console.log" append="off"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:17:21 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:17:21 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:17:21 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:17:21 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.315 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Preparing to wait for external event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.315 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.316 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.316 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.317 182759 DEBUG nova.virt.libvirt.vif [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-802139876',display_name='tempest-TestNetworkBasicOps-server-802139876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-802139876',id=135,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYawHES1Dl8cWfoRcVnJdkovN/OqIt5Nziohu0QwSEHRC3Kt+d0XSF/5jXquQtaLInb13URkzauTrULYPKVjHnM9UWAgs48JRCsN9Ey+2urYk0Y/V55QWRx25UlL+0jtQ==',key_name='tempest-TestNetworkBasicOps-1237503814',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-f06umpf3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:16Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=13e8c059-7a27-4db4-8151-32511c435aaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.318 182759 DEBUG nova.network.os_vif_util [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.319 182759 DEBUG nova.network.os_vif_util [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.319 182759 DEBUG os_vif [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.320 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.321 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.322 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.328 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.328 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5ddb845-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.329 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5ddb845-36, col_values=(('external_ids', {'iface-id': 'b5ddb845-36ad-438e-b95d-6d5c696671e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:f9:d2', 'vm-uuid': '13e8c059-7a27-4db4-8151-32511c435aaf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.330 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:21 np0005591285 NetworkManager[55017]: <info>  [1769041041.3315] manager: (tapb5ddb845-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.333 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.338 182759 INFO os_vif [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36')#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.410 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.411 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.411 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:30:f9:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:17:21 np0005591285 nova_compute[182755]: 2026-01-22 00:17:21.412 182759 INFO nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Using config drive#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.108 182759 INFO nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Creating config drive at /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.config#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.117 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz1a1nkjj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.242 182759 DEBUG oslo_concurrency.processutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz1a1nkjj" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:22 np0005591285 kernel: tapb5ddb845-36: entered promiscuous mode
Jan 21 19:17:22 np0005591285 NetworkManager[55017]: <info>  [1769041042.3293] manager: (tapb5ddb845-36): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Jan 21 19:17:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:22Z|00515|binding|INFO|Claiming lport b5ddb845-36ad-438e-b95d-6d5c696671e9 for this chassis.
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:22Z|00516|binding|INFO|b5ddb845-36ad-438e-b95d-6d5c696671e9: Claiming fa:16:3e:30:f9:d2 10.100.0.8
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.345 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:f9:d2 10.100.0.8'], port_security=['fa:16:3e:30:f9:d2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '13e8c059-7a27-4db4-8151-32511c435aaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0671c46-b585-465c-a43e-4ccad6b37e41, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b5ddb845-36ad-438e-b95d-6d5c696671e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.347 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b5ddb845-36ad-438e-b95d-6d5c696671e9 in datapath ed0e337a-102a-4cfd-8393-2e4b081cc9be bound to our chassis#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.348 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed0e337a-102a-4cfd-8393-2e4b081cc9be#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.360 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2133cb-6ac4-42a9-a62e-821fa9e50db9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.361 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped0e337a-11 in ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.364 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped0e337a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.364 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[319e6d72-f3be-453c-8e85-4b3813ed93c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 systemd-udevd[232959]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.365 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee57533-2ab4-4a76-b617-c339604fcbe2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.377 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6ba0e9-d21f-4fb6-85ec-e32889a47af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 NetworkManager[55017]: <info>  [1769041042.3803] device (tapb5ddb845-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:17:22 np0005591285 NetworkManager[55017]: <info>  [1769041042.3813] device (tapb5ddb845-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:17:22 np0005591285 systemd-machined[154022]: New machine qemu-61-instance-00000087.
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.390 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:22Z|00517|binding|INFO|Setting lport b5ddb845-36ad-438e-b95d-6d5c696671e9 ovn-installed in OVS
Jan 21 19:17:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:22Z|00518|binding|INFO|Setting lport b5ddb845-36ad-438e-b95d-6d5c696671e9 up in Southbound
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.397 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 systemd[1]: Started Virtual Machine qemu-61-instance-00000087.
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.402 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[daf6f6c9-931a-4ca7-89c0-3a5164a4de05]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.436 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3035107a-e1d9-41d8-9b94-b9d8b334f044]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.445 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb2acc6-1348-406d-8bec-72640df24586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 NetworkManager[55017]: <info>  [1769041042.4472] manager: (taped0e337a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.485 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4f1d83-b275-44f3-a0fd-6fd56a851b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.491 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a7fc16-fdb0-4701-b0ae-70983cfebd3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 NetworkManager[55017]: <info>  [1769041042.5217] device (taped0e337a-10): carrier: link connected
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.529 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb0d3d9-3003-4d98-9718-7155a1cf9d79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.552 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0c86e692-8564-4f60-af5b-027196d9d40f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped0e337a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:b4:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560118, 'reachable_time': 29201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232992, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.568 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d818fa65-f8df-4988-a26e-400b5121ecc5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:b48c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560118, 'tstamp': 560118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232993, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.587 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[25a35ce7-37d3-4d67-9e3f-fa85a0e376e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped0e337a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:b4:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560118, 'reachable_time': 29201, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232994, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.621 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[43ab40f0-9f5a-46e1-96d9-3bac6f72c9cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.688 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[75f0567b-45f6-4c9c-9336-55d9b54b5b24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.689 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped0e337a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.690 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.690 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped0e337a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:22 np0005591285 kernel: taped0e337a-10: entered promiscuous mode
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.692 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 NetworkManager[55017]: <info>  [1769041042.6934] manager: (taped0e337a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.699 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped0e337a-10, col_values=(('external_ids', {'iface-id': 'a677c548-67db-4eb4-acb2-02020cd1507a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.701 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:22Z|00519|binding|INFO|Releasing lport a677c548-67db-4eb4-acb2-02020cd1507a from this chassis (sb_readonly=0)
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.702 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.704 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed0e337a-102a-4cfd-8393-2e4b081cc9be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed0e337a-102a-4cfd-8393-2e4b081cc9be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.705 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc77790-db55-4c22-93aa-d8b5ea51d8f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.706 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-ed0e337a-102a-4cfd-8393-2e4b081cc9be
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/ed0e337a-102a-4cfd-8393-2e4b081cc9be.pid.haproxy
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID ed0e337a-102a-4cfd-8393-2e4b081cc9be
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:17:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:22.706 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'env', 'PROCESS_TAG=haproxy-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed0e337a-102a-4cfd-8393-2e4b081cc9be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.718 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.830 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041042.829706, 13e8c059-7a27-4db4-8151-32511c435aaf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.830 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] VM Started (Lifecycle Event)#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.865 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.869 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041042.8300269, 13e8c059-7a27-4db4-8151-32511c435aaf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.869 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.894 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.901 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:17:22 np0005591285 nova_compute[182755]: 2026-01-22 00:17:22.923 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:17:23 np0005591285 podman[233030]: 2026-01-22 00:17:23.088522205 +0000 UTC m=+0.054433783 container create 72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:17:23 np0005591285 systemd[1]: Started libpod-conmon-72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78.scope.
Jan 21 19:17:23 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:17:23 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f9033bcd8a20a687773b94f84f84e0cbdb3bced39a6253dcf8be691acefd1a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:17:23 np0005591285 podman[233030]: 2026-01-22 00:17:23.057792196 +0000 UTC m=+0.023703784 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:17:23 np0005591285 podman[233030]: 2026-01-22 00:17:23.16184174 +0000 UTC m=+0.127753308 container init 72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:17:23 np0005591285 podman[233030]: 2026-01-22 00:17:23.166511655 +0000 UTC m=+0.132423193 container start 72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:17:23 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [NOTICE]   (233049) : New worker (233051) forked
Jan 21 19:17:23 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [NOTICE]   (233049) : Loading success.
Jan 21 19:17:23 np0005591285 nova_compute[182755]: 2026-01-22 00:17:23.691 182759 DEBUG nova.network.neutron [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Updated VIF entry in instance network info cache for port b5ddb845-36ad-438e-b95d-6d5c696671e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:17:23 np0005591285 nova_compute[182755]: 2026-01-22 00:17:23.692 182759 DEBUG nova.network.neutron [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Updating instance_info_cache with network_info: [{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:23 np0005591285 nova_compute[182755]: 2026-01-22 00:17:23.711 182759 DEBUG oslo_concurrency.lockutils [req-2db16135-c909-4fa6-af57-677b893947f0 req-e969de37-76f4-42b1-a0b1-e221fb0e8cb5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.086 182759 DEBUG nova.compute.manager [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.087 182759 DEBUG oslo_concurrency.lockutils [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.087 182759 DEBUG oslo_concurrency.lockutils [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.088 182759 DEBUG oslo_concurrency.lockutils [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.088 182759 DEBUG nova.compute.manager [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Processing event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.088 182759 DEBUG nova.compute.manager [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.088 182759 DEBUG oslo_concurrency.lockutils [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.089 182759 DEBUG oslo_concurrency.lockutils [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.089 182759 DEBUG oslo_concurrency.lockutils [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.089 182759 DEBUG nova.compute.manager [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] No waiting events found dispatching network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.089 182759 WARNING nova.compute.manager [req-f264e842-4969-4d75-9e12-d2235c1aacb0 req-886bb9e7-4948-42ef-9b5a-4e6a0c8b53af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received unexpected event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.090 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.095 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.096 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041044.0950458, 13e8c059-7a27-4db4-8151-32511c435aaf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.096 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.102 182759 INFO nova.virt.libvirt.driver [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Instance spawned successfully.#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.104 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.146 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.146 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.147 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.148 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.148 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.149 182759 DEBUG nova.virt.libvirt.driver [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.191 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.195 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.263 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.310 182759 INFO nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Took 7.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.310 182759 DEBUG nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.408 182759 INFO nova.compute.manager [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Took 9.67 seconds to build instance.#033[00m
Jan 21 19:17:24 np0005591285 nova_compute[182755]: 2026-01-22 00:17:24.429 182759 DEBUG oslo_concurrency.lockutils [None req-35e96e77-dcb6-4cc3-a259-d34651d26c6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:25.628 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:17:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:25.628 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:17:25 np0005591285 nova_compute[182755]: 2026-01-22 00:17:25.630 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:26 np0005591285 nova_compute[182755]: 2026-01-22 00:17:26.365 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:27 np0005591285 nova_compute[182755]: 2026-01-22 00:17:27.738 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:29 np0005591285 podman[233060]: 2026-01-22 00:17:29.218791045 +0000 UTC m=+0.085095639 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 21 19:17:29 np0005591285 podman[233061]: 2026-01-22 00:17:29.220882051 +0000 UTC m=+0.069300789 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:29 np0005591285 NetworkManager[55017]: <info>  [1769041049.3539] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 21 19:17:29 np0005591285 NetworkManager[55017]: <info>  [1769041049.3549] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.462 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:29 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:29Z|00520|binding|INFO|Releasing lport a677c548-67db-4eb4-acb2-02020cd1507a from this chassis (sb_readonly=0)
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.478 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.679 182759 DEBUG nova.compute.manager [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-changed-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.679 182759 DEBUG nova.compute.manager [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Refreshing instance network info cache due to event network-changed-b5ddb845-36ad-438e-b95d-6d5c696671e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.680 182759 DEBUG oslo_concurrency.lockutils [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.680 182759 DEBUG oslo_concurrency.lockutils [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:17:29 np0005591285 nova_compute[182755]: 2026-01-22 00:17:29.680 182759 DEBUG nova.network.neutron [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Refreshing network info cache for port b5ddb845-36ad-438e-b95d-6d5c696671e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.067 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.068 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.068 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.069 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.069 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.085 182759 INFO nova.compute.manager [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Terminating instance#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.100 182759 DEBUG nova.compute.manager [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:17:30 np0005591285 kernel: tapb5ddb845-36 (unregistering): left promiscuous mode
Jan 21 19:17:30 np0005591285 NetworkManager[55017]: <info>  [1769041050.1251] device (tapb5ddb845-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:17:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:30Z|00521|binding|INFO|Releasing lport b5ddb845-36ad-438e-b95d-6d5c696671e9 from this chassis (sb_readonly=0)
Jan 21 19:17:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:30Z|00522|binding|INFO|Setting lport b5ddb845-36ad-438e-b95d-6d5c696671e9 down in Southbound
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.137 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:30Z|00523|binding|INFO|Removing iface tapb5ddb845-36 ovn-installed in OVS
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.140 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.172 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.186 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:f9:d2 10.100.0.8'], port_security=['fa:16:3e:30:f9:d2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '13e8c059-7a27-4db4-8151-32511c435aaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0671c46-b585-465c-a43e-4ccad6b37e41, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b5ddb845-36ad-438e-b95d-6d5c696671e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.187 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b5ddb845-36ad-438e-b95d-6d5c696671e9 in datapath ed0e337a-102a-4cfd-8393-2e4b081cc9be unbound from our chassis#033[00m
Jan 21 19:17:30 np0005591285 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000087.scope: Deactivated successfully.
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.189 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed0e337a-102a-4cfd-8393-2e4b081cc9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:17:30 np0005591285 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000087.scope: Consumed 6.547s CPU time.
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.190 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4232eef8-5f32-416c-984e-07fabfa48345]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.191 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be namespace which is not needed anymore#033[00m
Jan 21 19:17:30 np0005591285 systemd-machined[154022]: Machine qemu-61-instance-00000087 terminated.
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.330 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.334 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [NOTICE]   (233049) : haproxy version is 2.8.14-c23fe91
Jan 21 19:17:30 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [NOTICE]   (233049) : path to executable is /usr/sbin/haproxy
Jan 21 19:17:30 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [WARNING]  (233049) : Exiting Master process...
Jan 21 19:17:30 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [ALERT]    (233049) : Current worker (233051) exited with code 143 (Terminated)
Jan 21 19:17:30 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233045]: [WARNING]  (233049) : All workers exited. Exiting... (0)
Jan 21 19:17:30 np0005591285 systemd[1]: libpod-72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78.scope: Deactivated successfully.
Jan 21 19:17:30 np0005591285 podman[233124]: 2026-01-22 00:17:30.365116908 +0000 UTC m=+0.065035305 container died 72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.380 182759 INFO nova.virt.libvirt.driver [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Instance destroyed successfully.#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.381 182759 DEBUG nova.objects.instance [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 13e8c059-7a27-4db4-8151-32511c435aaf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:30 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78-userdata-shm.mount: Deactivated successfully.
Jan 21 19:17:30 np0005591285 systemd[1]: var-lib-containers-storage-overlay-1f9033bcd8a20a687773b94f84f84e0cbdb3bced39a6253dcf8be691acefd1a4-merged.mount: Deactivated successfully.
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.402 182759 DEBUG nova.virt.libvirt.vif [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-802139876',display_name='tempest-TestNetworkBasicOps-server-802139876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-802139876',id=135,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYawHES1Dl8cWfoRcVnJdkovN/OqIt5Nziohu0QwSEHRC3Kt+d0XSF/5jXquQtaLInb13URkzauTrULYPKVjHnM9UWAgs48JRCsN9Ey+2urYk0Y/V55QWRx25UlL+0jtQ==',key_name='tempest-TestNetworkBasicOps-1237503814',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-f06umpf3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:17:24Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=13e8c059-7a27-4db4-8151-32511c435aaf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.403 182759 DEBUG nova.network.os_vif_util [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:30 np0005591285 podman[233124]: 2026-01-22 00:17:30.404158079 +0000 UTC m=+0.104076516 container cleanup 72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.404 182759 DEBUG nova.network.os_vif_util [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.405 182759 DEBUG os_vif [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.407 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.407 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5ddb845-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:17:30 np0005591285 systemd[1]: libpod-conmon-72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78.scope: Deactivated successfully.
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.417 182759 INFO os_vif [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36')#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.418 182759 INFO nova.virt.libvirt.driver [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Deleting instance files /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf_del#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.419 182759 INFO nova.virt.libvirt.driver [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Deletion of /var/lib/nova/instances/13e8c059-7a27-4db4-8151-32511c435aaf_del complete#033[00m
Jan 21 19:17:30 np0005591285 podman[233168]: 2026-01-22 00:17:30.475349687 +0000 UTC m=+0.043807509 container remove 72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.482 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e94c19-527b-4368-8b29-15f999c4dfdb]: (4, ('Thu Jan 22 12:17:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be (72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78)\n72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78\nThu Jan 22 12:17:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be (72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78)\n72d92c1eb4a661db94b6b80551ebab77f9f99014b8cda9a47ec647a03401fa78\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.484 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4e1a2cb7-25cc-448a-a450-83295394f701]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.484 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped0e337a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:30 np0005591285 kernel: taped0e337a-10: left promiscuous mode
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.487 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.499 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.504 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ddc4e6f-65a9-4311-bb5d-dd0a1104695c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.511 182759 INFO nova.compute.manager [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.512 182759 DEBUG oslo.service.loopingcall [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.512 182759 DEBUG nova.compute.manager [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:17:30 np0005591285 nova_compute[182755]: 2026-01-22 00:17:30.512 182759 DEBUG nova.network.neutron [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.516 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c37580-f622-45c3-b33a-9adac3229f14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.517 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d059e5ec-3730-4fd9-bf14-cfa75ff63ecb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.540 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dc81753a-9b18-4748-8b07-e3f8a46e88e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560108, 'reachable_time': 41857, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233182, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:30 np0005591285 systemd[1]: run-netns-ovnmeta\x2ded0e337a\x2d102a\x2d4cfd\x2d8393\x2d2e4b081cc9be.mount: Deactivated successfully.
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.543 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:17:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:30.544 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[74f87e55-cd40-4d11-a08a-1a1813d5f26d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.687 182759 DEBUG nova.network.neutron [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Updated VIF entry in instance network info cache for port b5ddb845-36ad-438e-b95d-6d5c696671e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.688 182759 DEBUG nova.network.neutron [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Updating instance_info_cache with network_info: [{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.720 182759 DEBUG oslo_concurrency.lockutils [req-9351240c-94ad-405d-a414-6f4582c21ef7 req-b943f896-c83e-4470-a031-2fdfb60aee22 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-13e8c059-7a27-4db4-8151-32511c435aaf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.780 182759 DEBUG nova.compute.manager [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-vif-unplugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.781 182759 DEBUG oslo_concurrency.lockutils [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.781 182759 DEBUG oslo_concurrency.lockutils [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.782 182759 DEBUG oslo_concurrency.lockutils [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.782 182759 DEBUG nova.compute.manager [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] No waiting events found dispatching network-vif-unplugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.782 182759 DEBUG nova.compute.manager [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-vif-unplugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.782 182759 DEBUG nova.compute.manager [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.783 182759 DEBUG oslo_concurrency.lockutils [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.783 182759 DEBUG oslo_concurrency.lockutils [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.783 182759 DEBUG oslo_concurrency.lockutils [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.783 182759 DEBUG nova.compute.manager [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] No waiting events found dispatching network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:31 np0005591285 nova_compute[182755]: 2026-01-22 00:17:31.784 182759 WARNING nova.compute.manager [req-82095a4c-ce39-4f87-bac6-27d79655e8e9 req-9cfc5463-6b87-4b42-bbe7-424063d21dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Received unexpected event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.405 182759 DEBUG nova.network.neutron [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.421 182759 INFO nova.compute.manager [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Took 1.91 seconds to deallocate network for instance.#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.504 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.587 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.588 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:32 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:32.630 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.654 182759 DEBUG nova.compute.provider_tree [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.675 182759 DEBUG nova.scheduler.client.report [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.702 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.741 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.744 182759 INFO nova.scheduler.client.report [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 13e8c059-7a27-4db4-8151-32511c435aaf#033[00m
Jan 21 19:17:32 np0005591285 nova_compute[182755]: 2026-01-22 00:17:32.871 182759 DEBUG oslo_concurrency.lockutils [None req-f06e3ef7-6ebd-490e-8421-ecf30454d4cb 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "13e8c059-7a27-4db4-8151-32511c435aaf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:35 np0005591285 nova_compute[182755]: 2026-01-22 00:17:35.411 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:37 np0005591285 nova_compute[182755]: 2026-01-22 00:17:37.744 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:38 np0005591285 nova_compute[182755]: 2026-01-22 00:17:38.794 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:40 np0005591285 podman[233183]: 2026-01-22 00:17:40.174707583 +0000 UTC m=+0.049258185 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:17:40 np0005591285 nova_compute[182755]: 2026-01-22 00:17:40.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:42 np0005591285 nova_compute[182755]: 2026-01-22 00:17:42.746 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:44 np0005591285 podman[233209]: 2026-01-22 00:17:44.227915076 +0000 UTC m=+0.079854920 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:17:44 np0005591285 podman[233208]: 2026-01-22 00:17:44.228562383 +0000 UTC m=+0.086761324 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 19:17:44 np0005591285 nova_compute[182755]: 2026-01-22 00:17:44.233 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.212 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.212 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.230 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.378 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041050.3766315, 13e8c059-7a27-4db4-8151-32511c435aaf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.378 182759 INFO nova.compute.manager [-] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.404 182759 DEBUG nova.compute.manager [None req-d8555f0a-22db-47e5-b4a6-576e9206ec18 - - - - - -] [instance: 13e8c059-7a27-4db4-8151-32511c435aaf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.638 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.639 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.649 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.650 182759 INFO nova.compute.claims [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.837 182759 DEBUG nova.compute.provider_tree [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.855 182759 DEBUG nova.scheduler.client.report [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.880 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.881 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.957 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.957 182759 DEBUG nova.network.neutron [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:17:45 np0005591285 nova_compute[182755]: 2026-01-22 00:17:45.987 182759 INFO nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.013 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.201 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.203 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.203 182759 INFO nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Creating image(s)#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.204 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.205 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.206 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.222 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:46 np0005591285 podman[233253]: 2026-01-22 00:17:46.232271545 +0000 UTC m=+0.100130040 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.295 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.296 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.297 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.311 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.392 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.394 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.451 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.453 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.454 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.542 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.543 182759 DEBUG nova.virt.disk.api [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.544 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.564 182759 DEBUG nova.policy [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.601 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.601 182759 DEBUG nova.virt.disk.api [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.602 182759 DEBUG nova.objects.instance [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 5fd1f867-a38b-4022-886e-080b31068c65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.616 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.617 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Ensure instance console log exists: /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.617 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.618 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:46 np0005591285 nova_compute[182755]: 2026-01-22 00:17:46.618 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:47 np0005591285 nova_compute[182755]: 2026-01-22 00:17:47.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:47 np0005591285 nova_compute[182755]: 2026-01-22 00:17:47.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:17:47 np0005591285 nova_compute[182755]: 2026-01-22 00:17:47.749 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.412 182759 DEBUG nova.network.neutron [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Successfully updated port: b5ddb845-36ad-438e-b95d-6d5c696671e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.444 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-5fd1f867-a38b-4022-886e-080b31068c65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.445 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-5fd1f867-a38b-4022-886e-080b31068c65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.445 182759 DEBUG nova.network.neutron [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.454 182759 DEBUG nova.compute.manager [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.578 182759 DEBUG nova.compute.manager [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received event network-changed-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.578 182759 DEBUG nova.compute.manager [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Refreshing instance network info cache due to event network-changed-b5ddb845-36ad-438e-b95d-6d5c696671e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.578 182759 DEBUG oslo_concurrency.lockutils [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5fd1f867-a38b-4022-886e-080b31068c65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.592 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.592 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.624 182759 DEBUG nova.objects.instance [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'pci_requests' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.641 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.641 182759 INFO nova.compute.claims [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.642 182759 DEBUG nova.objects.instance [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'resources' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.661 182759 DEBUG nova.objects.instance [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'numa_topology' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.696 182759 DEBUG nova.objects.instance [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'pci_devices' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.702 182759 DEBUG nova.network.neutron [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.781 182759 INFO nova.compute.resource_tracker [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating resource usage from migration bd86f60c-99de-4bae-8548-969bcc2d8d50#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.781 182759 DEBUG nova.compute.resource_tracker [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Starting to track incoming migration bd86f60c-99de-4bae-8548-969bcc2d8d50 with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.870 182759 DEBUG nova.compute.provider_tree [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.890 182759 DEBUG nova.scheduler.client.report [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.936 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.343s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:48 np0005591285 nova_compute[182755]: 2026-01-22 00:17:48.936 182759 INFO nova.compute.manager [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Migrating#033[00m
Jan 21 19:17:49 np0005591285 nova_compute[182755]: 2026-01-22 00:17:49.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:49 np0005591285 nova_compute[182755]: 2026-01-22 00:17:49.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:17:49 np0005591285 nova_compute[182755]: 2026-01-22 00:17:49.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:17:49 np0005591285 nova_compute[182755]: 2026-01-22 00:17:49.278 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:17:49 np0005591285 nova_compute[182755]: 2026-01-22 00:17:49.278 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:17:49 np0005591285 nova_compute[182755]: 2026-01-22 00:17:49.279 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.333 182759 DEBUG nova.network.neutron [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Updating instance_info_cache with network_info: [{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.417 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.569 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-5fd1f867-a38b-4022-886e-080b31068c65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.569 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Instance network_info: |[{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.570 182759 DEBUG oslo_concurrency.lockutils [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5fd1f867-a38b-4022-886e-080b31068c65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.570 182759 DEBUG nova.network.neutron [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Refreshing network info cache for port b5ddb845-36ad-438e-b95d-6d5c696671e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.574 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Start _get_guest_xml network_info=[{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.579 182759 WARNING nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.587 182759 DEBUG nova.virt.libvirt.host [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.588 182759 DEBUG nova.virt.libvirt.host [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.593 182759 DEBUG nova.virt.libvirt.host [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.594 182759 DEBUG nova.virt.libvirt.host [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.595 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.596 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.597 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.597 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.598 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.598 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.599 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.599 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.600 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.600 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.601 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.601 182759 DEBUG nova.virt.hardware [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.609 182759 DEBUG nova.virt.libvirt.vif [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1162631708',display_name='tempest-TestNetworkBasicOps-server-1162631708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1162631708',id=137,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1GgERA5A6dlnUAgyjlcKhVfhPfweuHGtTAxq6FqnnvgM7bKXaovy0QOYaTYF/A7+y4NM6KVriyhlUw8VnZj6YUMZAIXL846IzmPuKgYor70BbYEoJIH0X6V5P8cATakw==',key_name='tempest-TestNetworkBasicOps-935473695',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-nbmafi1w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:46Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=5fd1f867-a38b-4022-886e-080b31068c65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.610 182759 DEBUG nova.network.os_vif_util [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.611 182759 DEBUG nova.network.os_vif_util [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.613 182759 DEBUG nova.objects.instance [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 5fd1f867-a38b-4022-886e-080b31068c65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.769 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <uuid>5fd1f867-a38b-4022-886e-080b31068c65</uuid>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <name>instance-00000089</name>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-1162631708</nova:name>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:17:50</nova:creationTime>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        <nova:port uuid="b5ddb845-36ad-438e-b95d-6d5c696671e9">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <entry name="serial">5fd1f867-a38b-4022-886e-080b31068c65</entry>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <entry name="uuid">5fd1f867-a38b-4022-886e-080b31068c65</entry>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.config"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:30:f9:d2"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <target dev="tapb5ddb845-36"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/console.log" append="off"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:17:50 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:17:50 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:17:50 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:17:50 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.771 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Preparing to wait for external event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.772 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.773 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.773 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.775 182759 DEBUG nova.virt.libvirt.vif [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1162631708',display_name='tempest-TestNetworkBasicOps-server-1162631708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1162631708',id=137,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1GgERA5A6dlnUAgyjlcKhVfhPfweuHGtTAxq6FqnnvgM7bKXaovy0QOYaTYF/A7+y4NM6KVriyhlUw8VnZj6YUMZAIXL846IzmPuKgYor70BbYEoJIH0X6V5P8cATakw==',key_name='tempest-TestNetworkBasicOps-935473695',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-nbmafi1w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:46Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=5fd1f867-a38b-4022-886e-080b31068c65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.775 182759 DEBUG nova.network.os_vif_util [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.777 182759 DEBUG nova.network.os_vif_util [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.778 182759 DEBUG os_vif [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.779 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.780 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.781 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.786 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.787 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5ddb845-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.788 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5ddb845-36, col_values=(('external_ids', {'iface-id': 'b5ddb845-36ad-438e-b95d-6d5c696671e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:f9:d2', 'vm-uuid': '5fd1f867-a38b-4022-886e-080b31068c65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.825 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:50 np0005591285 NetworkManager[55017]: <info>  [1769041070.8277] manager: (tapb5ddb845-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.833 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.834 182759 INFO os_vif [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36')#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.907 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.907 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.907 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:30:f9:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:17:50 np0005591285 nova_compute[182755]: 2026-01-22 00:17:50.908 182759 INFO nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Using config drive#033[00m
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.408 182759 INFO nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Creating config drive at /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.config#033[00m
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.419 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxiv2mhq0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.553 182759 DEBUG oslo_concurrency.processutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxiv2mhq0" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:51 np0005591285 kernel: tapb5ddb845-36: entered promiscuous mode
Jan 21 19:17:51 np0005591285 NetworkManager[55017]: <info>  [1769041071.6362] manager: (tapb5ddb845-36): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Jan 21 19:17:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:51Z|00524|binding|INFO|Claiming lport b5ddb845-36ad-438e-b95d-6d5c696671e9 for this chassis.
Jan 21 19:17:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:51Z|00525|binding|INFO|b5ddb845-36ad-438e-b95d-6d5c696671e9: Claiming fa:16:3e:30:f9:d2 10.100.0.8
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.639 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.654 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:f9:d2 10.100.0.8'], port_security=['fa:16:3e:30:f9:d2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5fd1f867-a38b-4022-886e-080b31068c65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0671c46-b585-465c-a43e-4ccad6b37e41, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b5ddb845-36ad-438e-b95d-6d5c696671e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.655 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b5ddb845-36ad-438e-b95d-6d5c696671e9 in datapath ed0e337a-102a-4cfd-8393-2e4b081cc9be bound to our chassis#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.657 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed0e337a-102a-4cfd-8393-2e4b081cc9be#033[00m
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.665 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:51Z|00526|binding|INFO|Setting lport b5ddb845-36ad-438e-b95d-6d5c696671e9 ovn-installed in OVS
Jan 21 19:17:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:51Z|00527|binding|INFO|Setting lport b5ddb845-36ad-438e-b95d-6d5c696671e9 up in Southbound
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.669 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7592c747-d23e-4c91-9701-6f4825b98c3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.671 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped0e337a-11 in ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.671 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.673 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped0e337a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.673 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ee51fac7-4a8d-4d8e-921c-3d30dd7a8f7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 nova_compute[182755]: 2026-01-22 00:17:51.674 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.674 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b0b74c-7ed8-4cc2-907c-f41c3cc725a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 systemd-udevd[233316]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.687 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[59105711-83a4-468e-8be6-bcffbe43d7e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 NetworkManager[55017]: <info>  [1769041071.6956] device (tapb5ddb845-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:17:51 np0005591285 NetworkManager[55017]: <info>  [1769041071.6964] device (tapb5ddb845-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:17:51 np0005591285 systemd-machined[154022]: New machine qemu-62-instance-00000089.
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.711 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0404b7-4979-46e5-a9e7-20d8cdc0b8db]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 systemd[1]: Started Virtual Machine qemu-62-instance-00000089.
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.749 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bec63fd4-78f2-4cc6-ad0d-357b0adae5ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.755 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5ebf5c9f-53a4-4bb1-9b24-e4e1b7bca92b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 NetworkManager[55017]: <info>  [1769041071.7587] manager: (taped0e337a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/256)
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.793 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[59aa1786-284d-4723-be3d-a80e23b39f83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.797 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9e658485-bfd1-4fd4-b187-aa1f4c6e43eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 NetworkManager[55017]: <info>  [1769041071.8217] device (taped0e337a-10): carrier: link connected
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.827 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b2691bc8-ba56-4365-b752-1dbd1fb4ccf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.844 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd62de1-ddac-438f-89ae-61a08f9c604c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped0e337a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:b4:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563048, 'reachable_time': 35634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233348, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.860 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[18f252c3-88f5-4850-804b-b7c7b87a3543]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:b48c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563048, 'tstamp': 563048}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233349, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.882 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[79d8d081-4191-4ea2-945c-88d970ff847d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped0e337a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:b4:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 165], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563048, 'reachable_time': 35634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233350, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.928 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[95e1ac15-563f-477c-bbcd-b5ddab868aa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:51.998 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fa650d29-cbd7-42a3-996e-e50fbc586cf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.001 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped0e337a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.001 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.002 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped0e337a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:52 np0005591285 kernel: taped0e337a-10: entered promiscuous mode
Jan 21 19:17:52 np0005591285 NetworkManager[55017]: <info>  [1769041072.0082] manager: (taped0e337a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.010 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.011 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped0e337a-10, col_values=(('external_ids', {'iface-id': 'a677c548-67db-4eb4-acb2-02020cd1507a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:52 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:52Z|00528|binding|INFO|Releasing lport a677c548-67db-4eb4-acb2-02020cd1507a from this chassis (sb_readonly=0)
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.015 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed0e337a-102a-4cfd-8393-2e4b081cc9be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed0e337a-102a-4cfd-8393-2e4b081cc9be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.016 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[32a3a884-358e-412a-bd22-2b5a8551a022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.017 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-ed0e337a-102a-4cfd-8393-2e4b081cc9be
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/ed0e337a-102a-4cfd-8393-2e4b081cc9be.pid.haproxy
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID ed0e337a-102a-4cfd-8393-2e4b081cc9be
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:17:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:52.021 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'env', 'PROCESS_TAG=haproxy-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed0e337a-102a-4cfd-8393-2e4b081cc9be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.031 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:52 np0005591285 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 19:17:52 np0005591285 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 19:17:52 np0005591285 systemd-logind[788]: New session 48 of user nova.
Jan 21 19:17:52 np0005591285 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 19:17:52 np0005591285 systemd[1]: Starting User Manager for UID 42436...
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:52 np0005591285 systemd[233364]: Queued start job for default target Main User Target.
Jan 21 19:17:52 np0005591285 systemd[233364]: Created slice User Application Slice.
Jan 21 19:17:52 np0005591285 systemd[233364]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 19:17:52 np0005591285 systemd[233364]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 19:17:52 np0005591285 systemd[233364]: Reached target Paths.
Jan 21 19:17:52 np0005591285 systemd[233364]: Reached target Timers.
Jan 21 19:17:52 np0005591285 systemd[233364]: Starting D-Bus User Message Bus Socket...
Jan 21 19:17:52 np0005591285 systemd[233364]: Starting Create User's Volatile Files and Directories...
Jan 21 19:17:52 np0005591285 systemd[233364]: Listening on D-Bus User Message Bus Socket.
Jan 21 19:17:52 np0005591285 systemd[233364]: Reached target Sockets.
Jan 21 19:17:52 np0005591285 systemd[233364]: Finished Create User's Volatile Files and Directories.
Jan 21 19:17:52 np0005591285 systemd[233364]: Reached target Basic System.
Jan 21 19:17:52 np0005591285 systemd[233364]: Reached target Main User Target.
Jan 21 19:17:52 np0005591285 systemd[233364]: Startup finished in 156ms.
Jan 21 19:17:52 np0005591285 systemd[1]: Started User Manager for UID 42436.
Jan 21 19:17:52 np0005591285 podman[233399]: 2026-01-22 00:17:52.411016169 +0000 UTC m=+0.062978450 container create 5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:17:52 np0005591285 systemd[1]: Started Session 48 of User nova.
Jan 21 19:17:52 np0005591285 podman[233399]: 2026-01-22 00:17:52.373822508 +0000 UTC m=+0.025784799 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:17:52 np0005591285 systemd[1]: Started libpod-conmon-5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396.scope.
Jan 21 19:17:52 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:17:52 np0005591285 systemd[1]: session-48.scope: Deactivated successfully.
Jan 21 19:17:52 np0005591285 systemd-logind[788]: Session 48 logged out. Waiting for processes to exit.
Jan 21 19:17:52 np0005591285 systemd-logind[788]: Removed session 48.
Jan 21 19:17:52 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28838ccdb78abd70def816810076e5adb414d1ded3dbf5724b7a7e35c5ab574a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.534 182759 DEBUG nova.compute.manager [req-45a29253-8f20-44d5-8af0-eedb0ad19849 req-363ce5ee-8b2e-4856-a74b-6f29b4a32f67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.536 182759 DEBUG oslo_concurrency.lockutils [req-45a29253-8f20-44d5-8af0-eedb0ad19849 req-363ce5ee-8b2e-4856-a74b-6f29b4a32f67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.537 182759 DEBUG oslo_concurrency.lockutils [req-45a29253-8f20-44d5-8af0-eedb0ad19849 req-363ce5ee-8b2e-4856-a74b-6f29b4a32f67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.537 182759 DEBUG oslo_concurrency.lockutils [req-45a29253-8f20-44d5-8af0-eedb0ad19849 req-363ce5ee-8b2e-4856-a74b-6f29b4a32f67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.537 182759 DEBUG nova.compute.manager [req-45a29253-8f20-44d5-8af0-eedb0ad19849 req-363ce5ee-8b2e-4856-a74b-6f29b4a32f67 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Processing event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:17:52 np0005591285 podman[233399]: 2026-01-22 00:17:52.549586833 +0000 UTC m=+0.201549174 container init 5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:17:52 np0005591285 podman[233399]: 2026-01-22 00:17:52.561322527 +0000 UTC m=+0.213284808 container start 5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:17:52 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [NOTICE]   (233423) : New worker (233425) forked
Jan 21 19:17:52 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [NOTICE]   (233423) : Loading success.
Jan 21 19:17:52 np0005591285 systemd-logind[788]: New session 50 of user nova.
Jan 21 19:17:52 np0005591285 systemd[1]: Started Session 50 of User nova.
Jan 21 19:17:52 np0005591285 systemd[1]: session-50.scope: Deactivated successfully.
Jan 21 19:17:52 np0005591285 systemd-logind[788]: Session 50 logged out. Waiting for processes to exit.
Jan 21 19:17:52 np0005591285 systemd-logind[788]: Removed session 50.
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.751 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.859 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.860 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041072.8587072, 5fd1f867-a38b-4022-886e-080b31068c65 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.860 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] VM Started (Lifecycle Event)#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.864 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.869 182759 INFO nova.virt.libvirt.driver [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Instance spawned successfully.#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.870 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.899 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.905 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.909 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.909 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.910 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.910 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.911 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.912 182759 DEBUG nova.virt.libvirt.driver [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.947 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.948 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041072.8597915, 5fd1f867-a38b-4022-886e-080b31068c65 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.948 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.982 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.986 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041072.8632827, 5fd1f867-a38b-4022-886e-080b31068c65 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:17:52 np0005591285 nova_compute[182755]: 2026-01-22 00:17:52.986 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.015 182759 INFO nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Took 6.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.015 182759 DEBUG nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.016 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.023 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.057 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.102 182759 DEBUG nova.network.neutron [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Updated VIF entry in instance network info cache for port b5ddb845-36ad-438e-b95d-6d5c696671e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.102 182759 DEBUG nova.network.neutron [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Updating instance_info_cache with network_info: [{"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.207 182759 DEBUG oslo_concurrency.lockutils [req-c1328963-1f75-4150-a97f-cff35ee8d394 req-01e6b09a-26a0-4620-8188-602b55c9d11b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5fd1f867-a38b-4022-886e-080b31068c65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.447 182759 INFO nova.compute.manager [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Took 8.08 seconds to build instance.#033[00m
Jan 21 19:17:53 np0005591285 nova_compute[182755]: 2026-01-22 00:17:53.670 182759 DEBUG oslo_concurrency.lockutils [None req-29da28cc-4080-4bd7-86ba-d3abad7df675 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.248 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.356 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.456 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.457 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.510 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.652 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.654 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5565MB free_disk=73.19247436523438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.655 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.655 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.694 182759 DEBUG nova.compute.manager [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.695 182759 DEBUG oslo_concurrency.lockutils [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.696 182759 DEBUG oslo_concurrency.lockutils [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.696 182759 DEBUG oslo_concurrency.lockutils [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.696 182759 DEBUG nova.compute.manager [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] No waiting events found dispatching network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.697 182759 WARNING nova.compute.manager [req-9a8ff611-f285-4f15-a0f9-b95221d9bed2 req-cbd5a8c3-10a8-4882-9df5-127c90615bfd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received unexpected event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.725 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Migration for instance 46feac9e-f412-4027-8cfb-f7280308085e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.756 182759 INFO nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating resource usage from migration bd86f60c-99de-4bae-8548-969bcc2d8d50#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.757 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Starting to track incoming migration bd86f60c-99de-4bae-8548-969bcc2d8d50 with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.792 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 5fd1f867-a38b-4022-886e-080b31068c65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.827 182759 WARNING nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 46feac9e-f412-4027-8cfb-f7280308085e has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.827 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.828 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.904 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.922 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.949 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:17:54 np0005591285 nova_compute[182755]: 2026-01-22 00:17:54.950 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.410 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.437 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid 5fd1f867-a38b-4022-886e-080b31068c65 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.438 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.439 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "5fd1f867-a38b-4022-886e-080b31068c65" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.489 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "5fd1f867-a38b-4022-886e-080b31068c65" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.619 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.620 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.621 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.621 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.621 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.634 182759 INFO nova.compute.manager [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Terminating instance#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.646 182759 DEBUG nova.compute.manager [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:17:55 np0005591285 kernel: tapb5ddb845-36 (unregistering): left promiscuous mode
Jan 21 19:17:55 np0005591285 NetworkManager[55017]: <info>  [1769041075.6666] device (tapb5ddb845-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:17:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:55Z|00529|binding|INFO|Releasing lport b5ddb845-36ad-438e-b95d-6d5c696671e9 from this chassis (sb_readonly=0)
Jan 21 19:17:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:55Z|00530|binding|INFO|Setting lport b5ddb845-36ad-438e-b95d-6d5c696671e9 down in Southbound
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.683 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:17:55Z|00531|binding|INFO|Removing iface tapb5ddb845-36 ovn-installed in OVS
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.694 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:f9:d2 10.100.0.8'], port_security=['fa:16:3e:30:f9:d2 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5fd1f867-a38b-4022-886e-080b31068c65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1260237103', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '9', 'neutron:security_group_ids': '86c286e4-25eb-4f2e-b5ff-30677cfd8882', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0671c46-b585-465c-a43e-4ccad6b37e41, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=b5ddb845-36ad-438e-b95d-6d5c696671e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.697 104259 INFO neutron.agent.ovn.metadata.agent [-] Port b5ddb845-36ad-438e-b95d-6d5c696671e9 in datapath ed0e337a-102a-4cfd-8393-2e4b081cc9be unbound from our chassis#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.699 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed0e337a-102a-4cfd-8393-2e4b081cc9be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.701 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[20da2ba4-a9de-413b-8950-a22691602934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.702 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be namespace which is not needed anymore#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.704 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000089.scope: Deactivated successfully.
Jan 21 19:17:55 np0005591285 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000089.scope: Consumed 3.959s CPU time.
Jan 21 19:17:55 np0005591285 systemd-machined[154022]: Machine qemu-62-instance-00000089 terminated.
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.826 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [NOTICE]   (233423) : haproxy version is 2.8.14-c23fe91
Jan 21 19:17:55 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [NOTICE]   (233423) : path to executable is /usr/sbin/haproxy
Jan 21 19:17:55 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [WARNING]  (233423) : Exiting Master process...
Jan 21 19:17:55 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [WARNING]  (233423) : Exiting Master process...
Jan 21 19:17:55 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [ALERT]    (233423) : Current worker (233425) exited with code 143 (Terminated)
Jan 21 19:17:55 np0005591285 neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be[233416]: [WARNING]  (233423) : All workers exited. Exiting... (0)
Jan 21 19:17:55 np0005591285 systemd[1]: libpod-5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396.scope: Deactivated successfully.
Jan 21 19:17:55 np0005591285 podman[233475]: 2026-01-22 00:17:55.85088684 +0000 UTC m=+0.048076502 container died 5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.871 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.877 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396-userdata-shm.mount: Deactivated successfully.
Jan 21 19:17:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay-28838ccdb78abd70def816810076e5adb414d1ded3dbf5724b7a7e35c5ab574a-merged.mount: Deactivated successfully.
Jan 21 19:17:55 np0005591285 podman[233475]: 2026-01-22 00:17:55.902524237 +0000 UTC m=+0.099713919 container cleanup 5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.911 182759 INFO nova.virt.libvirt.driver [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Instance destroyed successfully.#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.912 182759 DEBUG nova.objects.instance [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 5fd1f867-a38b-4022-886e-080b31068c65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:17:55 np0005591285 systemd[1]: libpod-conmon-5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396.scope: Deactivated successfully.
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.928 182759 DEBUG nova.virt.libvirt.vif [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1162631708',display_name='tempest-TestNetworkBasicOps-server-1162631708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1162631708',id=137,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1GgERA5A6dlnUAgyjlcKhVfhPfweuHGtTAxq6FqnnvgM7bKXaovy0QOYaTYF/A7+y4NM6KVriyhlUw8VnZj6YUMZAIXL846IzmPuKgYor70BbYEoJIH0X6V5P8cATakw==',key_name='tempest-TestNetworkBasicOps-935473695',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-nbmafi1w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:17:53Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=5fd1f867-a38b-4022-886e-080b31068c65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.929 182759 DEBUG nova.network.os_vif_util [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "address": "fa:16:3e:30:f9:d2", "network": {"id": "ed0e337a-102a-4cfd-8393-2e4b081cc9be", "bridge": "br-int", "label": "tempest-network-smoke--1899647072", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ddb845-36", "ovs_interfaceid": "b5ddb845-36ad-438e-b95d-6d5c696671e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.931 182759 DEBUG nova.network.os_vif_util [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.932 182759 DEBUG os_vif [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.934 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.934 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5ddb845-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.936 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.938 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.940 182759 INFO os_vif [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:f9:d2,bridge_name='br-int',has_traffic_filtering=True,id=b5ddb845-36ad-438e-b95d-6d5c696671e9,network=Network(ed0e337a-102a-4cfd-8393-2e4b081cc9be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb5ddb845-36')#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.940 182759 INFO nova.virt.libvirt.driver [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Deleting instance files /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65_del#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.941 182759 INFO nova.virt.libvirt.driver [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Deletion of /var/lib/nova/instances/5fd1f867-a38b-4022-886e-080b31068c65_del complete#033[00m
Jan 21 19:17:55 np0005591285 podman[233523]: 2026-01-22 00:17:55.978674597 +0000 UTC m=+0.046434009 container remove 5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.983 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7b435571-9712-43c8-8286-ff329938d387]: (4, ('Thu Jan 22 12:17:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be (5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396)\n5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396\nThu Jan 22 12:17:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be (5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396)\n5617eba4a502ccb4a928cea4faee0ae20ca488f29b06aa830ba7f93b23728396\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.985 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a91a1d7-a866-442d-ab67-69f2624baee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.986 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped0e337a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.988 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 kernel: taped0e337a-10: left promiscuous mode
Jan 21 19:17:55 np0005591285 nova_compute[182755]: 2026-01-22 00:17:55.991 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:55.993 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a443a9f-86ec-4fd3-b8d2-db379c9e85f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:56 np0005591285 nova_compute[182755]: 2026-01-22 00:17:56.006 182759 INFO nova.compute.manager [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:17:56 np0005591285 nova_compute[182755]: 2026-01-22 00:17:56.007 182759 DEBUG oslo.service.loopingcall [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:17:56 np0005591285 nova_compute[182755]: 2026-01-22 00:17:56.007 182759 DEBUG nova.compute.manager [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:17:56 np0005591285 nova_compute[182755]: 2026-01-22 00:17:56.007 182759 DEBUG nova.network.neutron [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:17:56 np0005591285 nova_compute[182755]: 2026-01-22 00:17:56.010 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:56.017 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9e6b45-8593-4ea3-b083-1fefb0cb83e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:56.018 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3f19c8-ef69-48c5-8904-a9ea4bfd0bd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:56 np0005591285 systemd-logind[788]: New session 51 of user nova.
Jan 21 19:17:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:56.031 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[55a2811e-ff84-4fe7-ac0e-f3677ed397ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563040, 'reachable_time': 21702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233539, 'error': None, 'target': 'ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:56.035 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed0e337a-102a-4cfd-8393-2e4b081cc9be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:17:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:17:56.035 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4e3514-7ae1-4c51-98f8-51719c6e2a65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:17:56 np0005591285 systemd[1]: Started Session 51 of User nova.
Jan 21 19:17:56 np0005591285 systemd[1]: run-netns-ovnmeta\x2ded0e337a\x2d102a\x2d4cfd\x2d8393\x2d2e4b081cc9be.mount: Deactivated successfully.
Jan 21 19:17:56 np0005591285 nova_compute[182755]: 2026-01-22 00:17:56.246 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:56 np0005591285 systemd[1]: session-51.scope: Deactivated successfully.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: Session 51 logged out. Waiting for processes to exit.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: Removed session 51.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: New session 52 of user nova.
Jan 21 19:17:56 np0005591285 systemd[1]: Started Session 52 of User nova.
Jan 21 19:17:56 np0005591285 systemd[1]: session-52.scope: Deactivated successfully.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: Session 52 logged out. Waiting for processes to exit.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: Removed session 52.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: New session 53 of user nova.
Jan 21 19:17:56 np0005591285 systemd[1]: Started Session 53 of User nova.
Jan 21 19:17:56 np0005591285 systemd[1]: session-53.scope: Deactivated successfully.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: Session 53 logged out. Waiting for processes to exit.
Jan 21 19:17:56 np0005591285 systemd-logind[788]: Removed session 53.
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.106 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.107 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.107 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.108 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.108 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.109 182759 WARNING nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.109 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.110 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.110 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.110 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.111 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.111 182759 WARNING nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.112 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received event network-vif-unplugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.112 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.113 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.113 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.114 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] No waiting events found dispatching network-vif-unplugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.114 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received event network-vif-unplugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.115 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.115 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5fd1f867-a38b-4022-886e-080b31068c65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.115 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.116 182759 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.116 182759 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] No waiting events found dispatching network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.117 182759 WARNING nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Received unexpected event network-vif-plugged-b5ddb845-36ad-438e-b95d-6d5c696671e9 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.753 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:17:57 np0005591285 nova_compute[182755]: 2026-01-22 00:17:57.776 182759 INFO nova.network.neutron [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating port 7bc267e3-f762-4a18-a3a2-42a7161a231e with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.312 182759 DEBUG nova.network.neutron [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.329 182759 INFO nova.compute.manager [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Took 2.32 seconds to deallocate network for instance.#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.429 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.430 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.491 182759 DEBUG nova.compute.provider_tree [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.511 182759 DEBUG nova.scheduler.client.report [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.541 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.575 182759 INFO nova.scheduler.client.report [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 5fd1f867-a38b-4022-886e-080b31068c65#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.744 182759 DEBUG oslo_concurrency.lockutils [None req-7aec05fb-e9bb-42ad-83fd-24a6c3eb8f91 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "5fd1f867-a38b-4022-886e-080b31068c65" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.746 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.746 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.746 182759 DEBUG nova.network.neutron [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.879 182759 DEBUG nova.compute.manager [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.880 182759 DEBUG nova.compute.manager [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing instance network info cache due to event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:17:58 np0005591285 nova_compute[182755]: 2026-01-22 00:17:58.880 182759 DEBUG oslo_concurrency.lockutils [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:00 np0005591285 nova_compute[182755]: 2026-01-22 00:18:00.125 182759 DEBUG nova.network.neutron [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:00 np0005591285 podman[233553]: 2026-01-22 00:18:00.222195782 +0000 UTC m=+0.078396271 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:18:00 np0005591285 podman[233552]: 2026-01-22 00:18:00.232408084 +0000 UTC m=+0.086791925 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Jan 21 19:18:00 np0005591285 nova_compute[182755]: 2026-01-22 00:18:00.562 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:00 np0005591285 nova_compute[182755]: 2026-01-22 00:18:00.566 182759 DEBUG oslo_concurrency.lockutils [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:00 np0005591285 nova_compute[182755]: 2026-01-22 00:18:00.566 182759 DEBUG nova.network.neutron [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:18:00 np0005591285 nova_compute[182755]: 2026-01-22 00:18:00.981 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.514 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.516 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.516 182759 INFO nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Creating image(s)#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.517 182759 DEBUG nova.objects.instance [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.537 182759 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.610 182759 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.611 182759 DEBUG nova.virt.disk.api [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Checking if we can resize image /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.612 182759 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.705 182759 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.706 182759 DEBUG nova.virt.disk.api [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Cannot resize image /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.732 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.733 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Ensure instance console log exists: /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.734 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.734 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.735 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.739 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Start _get_guest_xml network_info=[{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.746 182759 WARNING nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.757 182759 DEBUG nova.virt.libvirt.host [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.758 182759 DEBUG nova.virt.libvirt.host [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.765 182759 DEBUG nova.virt.libvirt.host [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.766 182759 DEBUG nova.virt.libvirt.host [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.767 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.767 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.768 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.768 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.768 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.768 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.768 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.769 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.769 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.769 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.769 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.770 182759 DEBUG nova.virt.hardware [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.770 182759 DEBUG nova.objects.instance [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.799 182759 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.876 182759 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.877 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.878 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.879 182759 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.880 182759 DEBUG nova.virt.libvirt.vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:57Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.880 182759 DEBUG nova.network.os_vif_util [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.881 182759 DEBUG nova.network.os_vif_util [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.883 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <uuid>46feac9e-f412-4027-8cfb-f7280308085e</uuid>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <name>instance-00000086</name>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1889498750</nova:name>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:18:01</nova:creationTime>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        <nova:port uuid="7bc267e3-f762-4a18-a3a2-42a7161a231e">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <entry name="serial">46feac9e-f412-4027-8cfb-f7280308085e</entry>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <entry name="uuid">46feac9e-f412-4027-8cfb-f7280308085e</entry>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:38:78:10"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <target dev="tap7bc267e3-f7"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/console.log" append="off"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:18:01 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:18:01 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:18:01 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:18:01 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.884 182759 DEBUG nova.virt.libvirt.vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:57Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.885 182759 DEBUG nova.network.os_vif_util [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.885 182759 DEBUG nova.network.os_vif_util [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.885 182759 DEBUG os_vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.886 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.886 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.887 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.889 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.889 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7bc267e3-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.890 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7bc267e3-f7, col_values=(('external_ids', {'iface-id': '7bc267e3-f762-4a18-a3a2-42a7161a231e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:78:10', 'vm-uuid': '46feac9e-f412-4027-8cfb-f7280308085e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.891 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:01 np0005591285 NetworkManager[55017]: <info>  [1769041081.8926] manager: (tap7bc267e3-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.894 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.900 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.901 182759 INFO os_vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7')#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.964 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.964 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.965 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No VIF found with MAC fa:16:3e:38:78:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:18:01 np0005591285 nova_compute[182755]: 2026-01-22 00:18:01.965 182759 INFO nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Using config drive#033[00m
Jan 21 19:18:02 np0005591285 kernel: tap7bc267e3-f7: entered promiscuous mode
Jan 21 19:18:02 np0005591285 NetworkManager[55017]: <info>  [1769041082.0228] manager: (tap7bc267e3-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:02Z|00532|binding|INFO|Claiming lport 7bc267e3-f762-4a18-a3a2-42a7161a231e for this chassis.
Jan 21 19:18:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:02Z|00533|binding|INFO|7bc267e3-f762-4a18-a3a2-42a7161a231e: Claiming fa:16:3e:38:78:10 10.100.0.12
Jan 21 19:18:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:02Z|00534|binding|INFO|Setting lport 7bc267e3-f762-4a18-a3a2-42a7161a231e ovn-installed in OVS
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.087 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:02 np0005591285 systemd-udevd[233619]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:18:02 np0005591285 systemd-machined[154022]: New machine qemu-63-instance-00000086.
Jan 21 19:18:02 np0005591285 NetworkManager[55017]: <info>  [1769041082.1053] device (tap7bc267e3-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:18:02 np0005591285 NetworkManager[55017]: <info>  [1769041082.1064] device (tap7bc267e3-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:18:02 np0005591285 systemd[1]: Started Virtual Machine qemu-63-instance-00000086.
Jan 21 19:18:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:02Z|00535|binding|INFO|Setting lport 7bc267e3-f762-4a18-a3a2-42a7161a231e up in Southbound
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.185 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:78:10 10.100.0.12'], port_security=['fa:16:3e:38:78:10 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-184c07f2-f316-4056-b962-173c9a73cccb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': '9e59c3b5-e637-42fe-b28f-811656431607', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac0dd3c8-754f-43f7-8c8a-c2e10a6719dc, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=7bc267e3-f762-4a18-a3a2-42a7161a231e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.187 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 7bc267e3-f762-4a18-a3a2-42a7161a231e in datapath 184c07f2-f316-4056-b962-173c9a73cccb bound to our chassis#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.188 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 184c07f2-f316-4056-b962-173c9a73cccb#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.199 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e45581b9-1976-4c55-8fd2-3a6bd0243526]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.200 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap184c07f2-f1 in ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.204 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap184c07f2-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.204 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ead12a41-d660-4245-8380-542a715d7973]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.205 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[85ea4f62-3741-4e8c-b5dd-15d7089e77ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.218 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[b696b1d2-41e6-4f4a-b3a6-f58fc1587bfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.241 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa2b7fa-a912-4085-b5bd-674d0d6e30f0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.267 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f540165b-dade-4f8b-9315-578e3cf51ff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 systemd-udevd[233622]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:18:02 np0005591285 NetworkManager[55017]: <info>  [1769041082.2736] manager: (tap184c07f2-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.274 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eb860983-0afc-4833-a5fb-1bc693d459fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.303 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[df47479a-a4f3-4835-90b3-f5ee29b64a0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.305 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca409ed-1977-4d4f-8791-1b13fa9e3a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 NetworkManager[55017]: <info>  [1769041082.3243] device (tap184c07f2-f0): carrier: link connected
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.333 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a0250e88-ad19-4c92-9b8a-c85d445fb443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.355 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[851f0665-bb9e-4c62-8e78-3dd775f1ab8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap184c07f2-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:28:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564098, 'reachable_time': 18957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233653, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.371 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[043a6640-a314-4080-a385-5c303e387c24]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:2880'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564098, 'tstamp': 564098}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233654, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.389 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[971a489d-db1c-43f5-a84d-51b3c0046b86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap184c07f2-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:28:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564098, 'reachable_time': 18957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233655, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.423 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab162b9-b994-414c-98e3-b517c7c8a7cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.479 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[465a55ae-47fd-4d49-898e-a6b04e88f5d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.481 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap184c07f2-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.481 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.481 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap184c07f2-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:02 np0005591285 NetworkManager[55017]: <info>  [1769041082.4840] manager: (tap184c07f2-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.483 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:02 np0005591285 kernel: tap184c07f2-f0: entered promiscuous mode
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.485 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.486 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap184c07f2-f0, col_values=(('external_ids', {'iface-id': 'fb0deda5-be9d-4b30-99e6-73fb36bd8567'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:02 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:02Z|00536|binding|INFO|Releasing lport fb0deda5-be9d-4b30-99e6-73fb36bd8567 from this chassis (sb_readonly=0)
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.488 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.488 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/184c07f2-f316-4056-b962-173c9a73cccb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/184c07f2-f316-4056-b962-173c9a73cccb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.503 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[35ca171f-8629-40d0-bbdd-5c3d65012439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.505 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-184c07f2-f316-4056-b962-173c9a73cccb
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/184c07f2-f316-4056-b962-173c9a73cccb.pid.haproxy
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 184c07f2-f316-4056-b962-173c9a73cccb
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.505 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'env', 'PROCESS_TAG=haproxy-184c07f2-f316-4056-b962-173c9a73cccb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/184c07f2-f316-4056-b962-173c9a73cccb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.507 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.555 182759 DEBUG nova.compute.manager [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.556 182759 DEBUG oslo_concurrency.lockutils [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.556 182759 DEBUG oslo_concurrency.lockutils [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.556 182759 DEBUG oslo_concurrency.lockutils [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.557 182759 DEBUG nova.compute.manager [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.557 182759 WARNING nova.compute.manager [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state active and task_state resize_finish.#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.641 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.642 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.675 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.756 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.758 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041082.7581177, 46feac9e-f412-4027-8cfb-f7280308085e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.759 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Resumed (Lifecycle Event)
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.760 182759 DEBUG nova.compute.manager [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.766 182759 INFO nova.virt.libvirt.driver [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance running successfully.
Jan 21 19:18:02 np0005591285 virtqemud[182299]: argument unsupported: QEMU guest agent is not configured
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.769 182759 DEBUG nova.virt.libvirt.guest [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.770 182759 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.908 182759 DEBUG nova.network.neutron [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updated VIF entry in instance network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.909 182759 DEBUG nova.network.neutron [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:18:02 np0005591285 podman[233694]: 2026-01-22 00:18:02.914254534 +0000 UTC m=+0.080089176 container create 986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.919 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.925 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:18:02 np0005591285 podman[233694]: 2026-01-22 00:18:02.860677076 +0000 UTC m=+0.026511768 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:18:02 np0005591285 systemd[1]: Started libpod-conmon-986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a.scope.
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.978 182759 DEBUG oslo_concurrency.lockutils [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.980 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.980 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041082.7583156, 46feac9e-f412-4027-8cfb-f7280308085e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:18:02 np0005591285 nova_compute[182755]: 2026-01-22 00:18:02.980 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Started (Lifecycle Event)
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.982 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.982 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:02.983 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:02 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:18:02 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb41bc3eaf541071b2398bd3c511117cfa24cbab927ad859e168138b2821b51e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:18:03 np0005591285 podman[233694]: 2026-01-22 00:18:03.008788655 +0000 UTC m=+0.174623317 container init 986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:18:03 np0005591285 podman[233694]: 2026-01-22 00:18:03.013821709 +0000 UTC m=+0.179656351 container start 986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 19:18:03 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [NOTICE]   (233713) : New worker (233715) forked
Jan 21 19:18:03 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [NOTICE]   (233713) : Loading success.
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.109 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.114 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.133 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.134 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.140 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.141 182759 INFO nova.compute.claims [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Claim successful on node compute-2.ctlplane.example.com
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.308 182759 DEBUG nova.compute.provider_tree [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.326 182759 DEBUG nova.scheduler.client.report [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.350 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.351 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.419 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.419 182759 DEBUG nova.network.neutron [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.443 182759 INFO nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.469 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.609 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.611 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.611 182759 INFO nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Creating image(s)
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.612 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.612 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.612 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.626 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.682 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.683 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.684 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.695 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.760 182759 DEBUG nova.policy [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.768 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.770 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.815 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.816 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.816 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.872 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.873 182759 DEBUG nova.virt.disk.api [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.874 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.929 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.930 182759 DEBUG nova.virt.disk.api [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.930 182759 DEBUG nova.objects.instance [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid 8b2b13b2-3477-4c12-b3a9-2af6bab94065 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.947 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.948 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Ensure instance console log exists: /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.948 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.949 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:03 np0005591285 nova_compute[182755]: 2026-01-22 00:18:03.949 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:04 np0005591285 nova_compute[182755]: 2026-01-22 00:18:04.975 182759 DEBUG nova.compute.manager [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:18:04 np0005591285 nova_compute[182755]: 2026-01-22 00:18:04.976 182759 DEBUG oslo_concurrency.lockutils [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:18:04 np0005591285 nova_compute[182755]: 2026-01-22 00:18:04.976 182759 DEBUG oslo_concurrency.lockutils [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:04 np0005591285 nova_compute[182755]: 2026-01-22 00:18:04.977 182759 DEBUG oslo_concurrency.lockutils [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:04 np0005591285 nova_compute[182755]: 2026-01-22 00:18:04.977 182759 DEBUG nova.compute.manager [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:18:04 np0005591285 nova_compute[182755]: 2026-01-22 00:18:04.977 182759 WARNING nova.compute.manager [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state resized and task_state None.
Jan 21 19:18:06 np0005591285 nova_compute[182755]: 2026-01-22 00:18:06.058 182759 DEBUG nova.network.neutron [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Successfully created port: fea9b5d1-8621-4677-a6ea-12b5e8e94d7f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 19:18:06 np0005591285 nova_compute[182755]: 2026-01-22 00:18:06.893 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:18:07 np0005591285 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 19:18:07 np0005591285 systemd[233364]: Activating special unit Exit the Session...
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped target Main User Target.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped target Basic System.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped target Paths.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped target Sockets.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped target Timers.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 19:18:07 np0005591285 systemd[233364]: Closed D-Bus User Message Bus Socket.
Jan 21 19:18:07 np0005591285 systemd[233364]: Stopped Create User's Volatile Files and Directories.
Jan 21 19:18:07 np0005591285 systemd[233364]: Removed slice User Application Slice.
Jan 21 19:18:07 np0005591285 systemd[233364]: Reached target Shutdown.
Jan 21 19:18:07 np0005591285 systemd[233364]: Finished Exit the Session.
Jan 21 19:18:07 np0005591285 systemd[233364]: Reached target Exit the Session.
Jan 21 19:18:07 np0005591285 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 19:18:07 np0005591285 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 19:18:07 np0005591285 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 19:18:07 np0005591285 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 19:18:07 np0005591285 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 19:18:07 np0005591285 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 19:18:07 np0005591285 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 19:18:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:07Z|00537|binding|INFO|Releasing lport fb0deda5-be9d-4b30-99e6-73fb36bd8567 from this chassis (sb_readonly=0)
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.510 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.761 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.939 182759 DEBUG nova.network.neutron [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Successfully updated port: fea9b5d1-8621-4677-a6ea-12b5e8e94d7f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:18:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:07.956 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:07.958 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.959 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.976 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-8b2b13b2-3477-4c12-b3a9-2af6bab94065" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.977 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-8b2b13b2-3477-4c12-b3a9-2af6bab94065" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:07 np0005591285 nova_compute[182755]: 2026-01-22 00:18:07.977 182759 DEBUG nova.network.neutron [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:18:08 np0005591285 nova_compute[182755]: 2026-01-22 00:18:08.169 182759 DEBUG nova.network.neutron [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:18:08 np0005591285 nova_compute[182755]: 2026-01-22 00:18:08.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:08 np0005591285 nova_compute[182755]: 2026-01-22 00:18:08.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.194 182759 DEBUG nova.network.neutron [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Updating instance_info_cache with network_info: [{"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.905 182759 DEBUG nova.compute.manager [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-changed-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.906 182759 DEBUG nova.compute.manager [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Refreshing instance network info cache due to event network-changed-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.906 182759 DEBUG oslo_concurrency.lockutils [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8b2b13b2-3477-4c12-b3a9-2af6bab94065" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.931 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-8b2b13b2-3477-4c12-b3a9-2af6bab94065" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.931 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Instance network_info: |[{"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.932 182759 DEBUG oslo_concurrency.lockutils [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8b2b13b2-3477-4c12-b3a9-2af6bab94065" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.932 182759 DEBUG nova.network.neutron [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Refreshing network info cache for port fea9b5d1-8621-4677-a6ea-12b5e8e94d7f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.938 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Start _get_guest_xml network_info=[{"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.946 182759 WARNING nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:18:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:09.961 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.972 182759 DEBUG nova.virt.libvirt.host [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.973 182759 DEBUG nova.virt.libvirt.host [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.980 182759 DEBUG nova.virt.libvirt.host [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.981 182759 DEBUG nova.virt.libvirt.host [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.984 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.985 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.986 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.986 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.987 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.987 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.988 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.988 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.989 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.989 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.990 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.990 182759 DEBUG nova.virt.hardware [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.998 182759 DEBUG nova.virt.libvirt.vif [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-543187526',display_name='tempest-ServersTestJSON-server-543187526',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-543187526',id=140,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI+5UlsueU3lcaZm16ny5jVaYKPfArNkzeHW3zm7gFwt8bS+VRDPU11yMxbEj7+CJvdXpT9AVhmj/1esV1llenKVR1u67vV6JxJQErFhdN+skSO+BFtU7tS54N2HmBaKuA==',key_name='tempest-key-1410824327',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-i67mk0ee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:03Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=8b2b13b2-3477-4c12-b3a9-2af6bab94065,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:18:09 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.998 182759 DEBUG nova.network.os_vif_util [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:09.999 182759 DEBUG nova.network.os_vif_util [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.000 182759 DEBUG nova.objects.instance [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 8b2b13b2-3477-4c12-b3a9-2af6bab94065 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.016 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <uuid>8b2b13b2-3477-4c12-b3a9-2af6bab94065</uuid>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <name>instance-0000008c</name>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersTestJSON-server-543187526</nova:name>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:18:09</nova:creationTime>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        <nova:port uuid="fea9b5d1-8621-4677-a6ea-12b5e8e94d7f">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <entry name="serial">8b2b13b2-3477-4c12-b3a9-2af6bab94065</entry>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <entry name="uuid">8b2b13b2-3477-4c12-b3a9-2af6bab94065</entry>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.config"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:f7:eb:21"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <target dev="tapfea9b5d1-86"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/console.log" append="off"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:18:10 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:18:10 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:18:10 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:18:10 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.017 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Preparing to wait for external event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.017 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.018 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.018 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.019 182759 DEBUG nova.virt.libvirt.vif [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-543187526',display_name='tempest-ServersTestJSON-server-543187526',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-543187526',id=140,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI+5UlsueU3lcaZm16ny5jVaYKPfArNkzeHW3zm7gFwt8bS+VRDPU11yMxbEj7+CJvdXpT9AVhmj/1esV1llenKVR1u67vV6JxJQErFhdN+skSO+BFtU7tS54N2HmBaKuA==',key_name='tempest-key-1410824327',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-i67mk0ee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:03Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=8b2b13b2-3477-4c12-b3a9-2af6bab94065,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.019 182759 DEBUG nova.network.os_vif_util [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.020 182759 DEBUG nova.network.os_vif_util [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.020 182759 DEBUG os_vif [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.021 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.022 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.022 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.025 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.026 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfea9b5d1-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.026 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfea9b5d1-86, col_values=(('external_ids', {'iface-id': 'fea9b5d1-8621-4677-a6ea-12b5e8e94d7f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:eb:21', 'vm-uuid': '8b2b13b2-3477-4c12-b3a9-2af6bab94065'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.029 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:10 np0005591285 NetworkManager[55017]: <info>  [1769041090.0305] manager: (tapfea9b5d1-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.033 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.043 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.044 182759 INFO os_vif [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86')#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.118 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.118 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.119 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:f7:eb:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.119 182759 INFO nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Using config drive#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.502 182759 INFO nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Creating config drive at /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.config#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.514 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5n22dzhi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.643 182759 DEBUG oslo_concurrency.processutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5n22dzhi" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:10 np0005591285 NetworkManager[55017]: <info>  [1769041090.7175] manager: (tapfea9b5d1-86): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Jan 21 19:18:10 np0005591285 kernel: tapfea9b5d1-86: entered promiscuous mode
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:10Z|00538|binding|INFO|Claiming lport fea9b5d1-8621-4677-a6ea-12b5e8e94d7f for this chassis.
Jan 21 19:18:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:10Z|00539|binding|INFO|fea9b5d1-8621-4677-a6ea-12b5e8e94d7f: Claiming fa:16:3e:f7:eb:21 10.100.0.8
Jan 21 19:18:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:10Z|00540|binding|INFO|Setting lport fea9b5d1-8621-4677-a6ea-12b5e8e94d7f ovn-installed in OVS
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.757 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:10Z|00541|binding|INFO|Setting lport fea9b5d1-8621-4677-a6ea-12b5e8e94d7f up in Southbound
Jan 21 19:18:10 np0005591285 systemd-machined[154022]: New machine qemu-64-instance-0000008c.
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.760 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:eb:21 10.100.0.8'], port_security=['fa:16:3e:f7:eb:21 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8b2b13b2-3477-4c12-b3a9-2af6bab94065', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.761 104259 INFO neutron.agent.ovn.metadata.agent [-] Port fea9b5d1-8621-4677-a6ea-12b5e8e94d7f in datapath aabf11c6-ef94-408a-8148-6c6400566606 bound to our chassis#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.763 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606#033[00m
Jan 21 19:18:10 np0005591285 systemd[1]: Started Virtual Machine qemu-64-instance-0000008c.
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.784 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fb622268-f6e1-4a94-8505-bfdf41340c5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.785 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabf11c6-e1 in ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.791 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabf11c6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.791 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a29e8b7b-a144-43b0-b833-8b9a8bd1b505]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 podman[233754]: 2026-01-22 00:18:10.79407799 +0000 UTC m=+0.081317369 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.792 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f31dccd6-49e1-4eff-861d-f75c06d0a19c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.807 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[b6bdb250-9648-49ef-aecf-ab2414081c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 systemd-udevd[233791]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.831 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4b1646-9608-4a9e-ad52-e8e6c33659e8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 NetworkManager[55017]: <info>  [1769041090.8383] device (tapfea9b5d1-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:18:10 np0005591285 NetworkManager[55017]: <info>  [1769041090.8393] device (tapfea9b5d1-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.870 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b6d828c0-9307-4594-8fcd-895e404413c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 systemd-udevd[233793]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:18:10 np0005591285 NetworkManager[55017]: <info>  [1769041090.8787] manager: (tapaabf11c6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.877 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a33c59a1-0510-4d18-99c6-a4b91fcca7e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.910 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041075.909076, 5fd1f867-a38b-4022-886e-080b31068c65 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.911 182759 INFO nova.compute.manager [-] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.912 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c85a24b6-3a0f-4f07-b52b-13cafc6fecbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.915 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[61abf8a9-56b7-4ddd-a2f8-939af7a77d7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 NetworkManager[55017]: <info>  [1769041090.9358] device (tapaabf11c6-e0): carrier: link connected
Jan 21 19:18:10 np0005591285 nova_compute[182755]: 2026-01-22 00:18:10.943 182759 DEBUG nova.compute.manager [None req-c68028ca-bf16-43af-9bfb-b01eb205d650 - - - - - -] [instance: 5fd1f867-a38b-4022-886e-080b31068c65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.943 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[34c44baf-078e-474c-a68b-d8891e340589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.961 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0acb9a-0707-4220-9987-839999c6459c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564959, 'reachable_time': 20024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233816, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.976 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[be74c4e1-0f8b-40fa-ab38-0d47de846eb1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:1b62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564959, 'tstamp': 564959}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233817, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:10.996 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c96bea-504e-41f0-9303-57a270698996]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564959, 'reachable_time': 20024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233818, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.017 182759 DEBUG nova.compute.manager [req-4e679b5f-730b-4d4e-a2c3-81e7d382c214 req-572db28f-e9d9-4350-84b4-17f9cc036ba5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.018 182759 DEBUG oslo_concurrency.lockutils [req-4e679b5f-730b-4d4e-a2c3-81e7d382c214 req-572db28f-e9d9-4350-84b4-17f9cc036ba5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.019 182759 DEBUG oslo_concurrency.lockutils [req-4e679b5f-730b-4d4e-a2c3-81e7d382c214 req-572db28f-e9d9-4350-84b4-17f9cc036ba5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.019 182759 DEBUG oslo_concurrency.lockutils [req-4e679b5f-730b-4d4e-a2c3-81e7d382c214 req-572db28f-e9d9-4350-84b4-17f9cc036ba5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.020 182759 DEBUG nova.compute.manager [req-4e679b5f-730b-4d4e-a2c3-81e7d382c214 req-572db28f-e9d9-4350-84b4-17f9cc036ba5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Processing event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.039 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c970d1-5f84-41d8-af9c-fdb5e5459e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.123 182759 DEBUG nova.network.neutron [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Updated VIF entry in instance network info cache for port fea9b5d1-8621-4677-a6ea-12b5e8e94d7f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.124 182759 DEBUG nova.network.neutron [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Updating instance_info_cache with network_info: [{"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.143 182759 DEBUG oslo_concurrency.lockutils [req-285da80f-2f4e-4d97-a0a7-d1dc8300b7b0 req-878ff122-8fd6-4c0d-9a14-33e763631120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8b2b13b2-3477-4c12-b3a9-2af6bab94065" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.148 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0481ad55-849f-405e-aebc-4561cb07e91b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.149 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.150 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.150 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.151 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:11 np0005591285 kernel: tapaabf11c6-e0: entered promiscuous mode
Jan 21 19:18:11 np0005591285 NetworkManager[55017]: <info>  [1769041091.1544] manager: (tapaabf11c6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.155 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.155 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:11Z|00542|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.171 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.172 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.173 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.174 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[02f890c8-2f15-4329-86b2-4a631c072eaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.175 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-aabf11c6-ef94-408a-8148-6c6400566606
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID aabf11c6-ef94-408a-8148-6c6400566606
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:18:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:11.176 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'env', 'PROCESS_TAG=haproxy-aabf11c6-ef94-408a-8148-6c6400566606', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabf11c6-ef94-408a-8148-6c6400566606.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.307 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041091.3069837, 8b2b13b2-3477-4c12-b3a9-2af6bab94065 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.308 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] VM Started (Lifecycle Event)#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.310 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.319 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.324 182759 INFO nova.virt.libvirt.driver [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Instance spawned successfully.#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.324 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.326 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.332 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.343 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.343 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.344 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.344 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.345 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.345 182759 DEBUG nova.virt.libvirt.driver [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.357 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.357 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041091.3071442, 8b2b13b2-3477-4c12-b3a9-2af6bab94065 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.358 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.376 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.379 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041091.3190072, 8b2b13b2-3477-4c12-b3a9-2af6bab94065 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.380 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.407 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.411 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.437 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.450 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:11 np0005591285 podman[233856]: 2026-01-22 00:18:11.609559482 +0000 UTC m=+0.054067353 container create 2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 21 19:18:11 np0005591285 systemd[1]: Started libpod-conmon-2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877.scope.
Jan 21 19:18:11 np0005591285 podman[233856]: 2026-01-22 00:18:11.580758444 +0000 UTC m=+0.025266345 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:18:11 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:18:11 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bb34b9a724b19aaa51bf9fa2a8929b456f110956e5817893de066e9948d8aed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:18:11 np0005591285 podman[233856]: 2026-01-22 00:18:11.72013449 +0000 UTC m=+0.164642381 container init 2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:18:11 np0005591285 podman[233856]: 2026-01-22 00:18:11.72875334 +0000 UTC m=+0.173261201 container start 2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:18:11 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [NOTICE]   (233875) : New worker (233877) forked
Jan 21 19:18:11 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [NOTICE]   (233875) : Loading success.
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.848 182759 INFO nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Took 8.24 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:18:11 np0005591285 nova_compute[182755]: 2026-01-22 00:18:11.849 182759 DEBUG nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:12 np0005591285 nova_compute[182755]: 2026-01-22 00:18:12.146 182759 INFO nova.compute.manager [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Took 9.05 seconds to build instance.#033[00m
Jan 21 19:18:12 np0005591285 nova_compute[182755]: 2026-01-22 00:18:12.171 182759 DEBUG oslo_concurrency.lockutils [None req-640e2d00-1896-4dac-9dd5-0441f5fff473 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:12 np0005591285 nova_compute[182755]: 2026-01-22 00:18:12.761 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.105 182759 DEBUG nova.compute.manager [req-91aea106-7b3c-43f2-9450-0b0782e3ae2c req-c2aac8dd-1157-4574-92f7-5e89227c57f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.105 182759 DEBUG oslo_concurrency.lockutils [req-91aea106-7b3c-43f2-9450-0b0782e3ae2c req-c2aac8dd-1157-4574-92f7-5e89227c57f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.106 182759 DEBUG oslo_concurrency.lockutils [req-91aea106-7b3c-43f2-9450-0b0782e3ae2c req-c2aac8dd-1157-4574-92f7-5e89227c57f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.106 182759 DEBUG oslo_concurrency.lockutils [req-91aea106-7b3c-43f2-9450-0b0782e3ae2c req-c2aac8dd-1157-4574-92f7-5e89227c57f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.106 182759 DEBUG nova.compute.manager [req-91aea106-7b3c-43f2-9450-0b0782e3ae2c req-c2aac8dd-1157-4574-92f7-5e89227c57f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] No waiting events found dispatching network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.107 182759 WARNING nova.compute.manager [req-91aea106-7b3c-43f2-9450-0b0782e3ae2c req-c2aac8dd-1157-4574-92f7-5e89227c57f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received unexpected event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f for instance with vm_state active and task_state None.#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.989 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.990 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.990 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.991 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:13 np0005591285 nova_compute[182755]: 2026-01-22 00:18:13.991 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.005 182759 INFO nova.compute.manager [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Terminating instance#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.019 182759 DEBUG nova.compute.manager [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:18:14 np0005591285 kernel: tapfea9b5d1-86 (unregistering): left promiscuous mode
Jan 21 19:18:14 np0005591285 NetworkManager[55017]: <info>  [1769041094.0423] device (tapfea9b5d1-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.051 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:14Z|00543|binding|INFO|Releasing lport fea9b5d1-8621-4677-a6ea-12b5e8e94d7f from this chassis (sb_readonly=0)
Jan 21 19:18:14 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:14Z|00544|binding|INFO|Setting lport fea9b5d1-8621-4677-a6ea-12b5e8e94d7f down in Southbound
Jan 21 19:18:14 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:14Z|00545|binding|INFO|Removing iface tapfea9b5d1-86 ovn-installed in OVS
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.053 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.060 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:eb:21 10.100.0.8'], port_security=['fa:16:3e:f7:eb:21 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8b2b13b2-3477-4c12-b3a9-2af6bab94065', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.062 104259 INFO neutron.agent.ovn.metadata.agent [-] Port fea9b5d1-8621-4677-a6ea-12b5e8e94d7f in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.063 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabf11c6-ef94-408a-8148-6c6400566606, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.064 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7a543e68-74b5-4dea-bc70-2ba7a642f170]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.065 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace which is not needed anymore#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Jan 21 19:18:14 np0005591285 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000008c.scope: Consumed 3.237s CPU time.
Jan 21 19:18:14 np0005591285 systemd-machined[154022]: Machine qemu-64-instance-0000008c terminated.
Jan 21 19:18:14 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [NOTICE]   (233875) : haproxy version is 2.8.14-c23fe91
Jan 21 19:18:14 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [NOTICE]   (233875) : path to executable is /usr/sbin/haproxy
Jan 21 19:18:14 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [WARNING]  (233875) : Exiting Master process...
Jan 21 19:18:14 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [WARNING]  (233875) : Exiting Master process...
Jan 21 19:18:14 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [ALERT]    (233875) : Current worker (233877) exited with code 143 (Terminated)
Jan 21 19:18:14 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[233871]: [WARNING]  (233875) : All workers exited. Exiting... (0)
Jan 21 19:18:14 np0005591285 systemd[1]: libpod-2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877.scope: Deactivated successfully.
Jan 21 19:18:14 np0005591285 podman[233913]: 2026-01-22 00:18:14.197341045 +0000 UTC m=+0.042196976 container died 2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:18:14 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877-userdata-shm.mount: Deactivated successfully.
Jan 21 19:18:14 np0005591285 systemd[1]: var-lib-containers-storage-overlay-0bb34b9a724b19aaa51bf9fa2a8929b456f110956e5817893de066e9948d8aed-merged.mount: Deactivated successfully.
Jan 21 19:18:14 np0005591285 podman[233913]: 2026-01-22 00:18:14.233664304 +0000 UTC m=+0.078520255 container cleanup 2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 19:18:14 np0005591285 systemd[1]: libpod-conmon-2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877.scope: Deactivated successfully.
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.289 182759 INFO nova.virt.libvirt.driver [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Instance destroyed successfully.#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.290 182759 DEBUG nova.objects.instance [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid 8b2b13b2-3477-4c12-b3a9-2af6bab94065 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:14 np0005591285 podman[233945]: 2026-01-22 00:18:14.305335564 +0000 UTC m=+0.050089536 container remove 2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.310 182759 DEBUG nova.virt.libvirt.vif [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-543187526',display_name='tempest-ServersTestJSON-server-543187526',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-543187526',id=140,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI+5UlsueU3lcaZm16ny5jVaYKPfArNkzeHW3zm7gFwt8bS+VRDPU11yMxbEj7+CJvdXpT9AVhmj/1esV1llenKVR1u67vV6JxJQErFhdN+skSO+BFtU7tS54N2HmBaKuA==',key_name='tempest-key-1410824327',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-i67mk0ee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:11Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=8b2b13b2-3477-4c12-b3a9-2af6bab94065,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.311 182759 DEBUG nova.network.os_vif_util [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "address": "fa:16:3e:f7:eb:21", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfea9b5d1-86", "ovs_interfaceid": "fea9b5d1-8621-4677-a6ea-12b5e8e94d7f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.311 182759 DEBUG nova.network.os_vif_util [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.312 182759 DEBUG os_vif [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.312 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c434a34e-662c-46e3-b4da-19404e85a886]: (4, ('Thu Jan 22 12:18:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877)\n2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877\nThu Jan 22 12:18:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877)\n2028a47040cbe935eebaa6424f312f5d3f3f3366da879a69d1fca1ab230f6877\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.313 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.313 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfea9b5d1-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.313 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[906460a0-6577-4b35-b44d-2e7c506ac2e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.314 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.315 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 kernel: tapaabf11c6-e0: left promiscuous mode
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.316 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.317 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.319 182759 INFO os_vif [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:eb:21,bridge_name='br-int',has_traffic_filtering=True,id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfea9b5d1-86')#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.319 182759 INFO nova.virt.libvirt.driver [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Deleting instance files /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065_del#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.320 182759 INFO nova.virt.libvirt.driver [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Deletion of /var/lib/nova/instances/8b2b13b2-3477-4c12-b3a9-2af6bab94065_del complete#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.334 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.337 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9b26981c-6fda-4d00-8104-9ff26545d7ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 podman[233952]: 2026-01-22 00:18:14.348326311 +0000 UTC m=+0.076225933 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.353 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[92bc7a94-fa79-4e93-9891-c12617e41a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.355 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[11fe8cf1-60f7-43c2-a936-27f709948adb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.372 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a477f5b4-f9ee-4ef0-802f-6211f761a592]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564952, 'reachable_time': 31187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234012, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.374 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:18:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:14.374 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[87029f13-b468-4497-823f-0a72ce62ef17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:14 np0005591285 systemd[1]: run-netns-ovnmeta\x2daabf11c6\x2def94\x2d408a\x2d8148\x2d6c6400566606.mount: Deactivated successfully.
Jan 21 19:18:14 np0005591285 podman[233961]: 2026-01-22 00:18:14.385285016 +0000 UTC m=+0.108096453 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.386 182759 INFO nova.compute.manager [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.386 182759 DEBUG oslo.service.loopingcall [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.386 182759 DEBUG nova.compute.manager [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:18:14 np0005591285 nova_compute[182755]: 2026-01-22 00:18:14.386 182759 DEBUG nova.network.neutron [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.444 182759 DEBUG nova.compute.manager [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-vif-unplugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.445 182759 DEBUG oslo_concurrency.lockutils [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.445 182759 DEBUG oslo_concurrency.lockutils [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.446 182759 DEBUG oslo_concurrency.lockutils [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.446 182759 DEBUG nova.compute.manager [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] No waiting events found dispatching network-vif-unplugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.446 182759 DEBUG nova.compute.manager [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-vif-unplugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.447 182759 DEBUG nova.compute.manager [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.447 182759 DEBUG oslo_concurrency.lockutils [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.447 182759 DEBUG oslo_concurrency.lockutils [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.448 182759 DEBUG oslo_concurrency.lockutils [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.448 182759 DEBUG nova.compute.manager [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] No waiting events found dispatching network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:15 np0005591285 nova_compute[182755]: 2026-01-22 00:18:15.448 182759 WARNING nova.compute.manager [req-477ccea1-c4f9-4c5a-964d-1ee3cf635896 req-76523f35-68a6-4706-8a63-eee510ffd90b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received unexpected event network-vif-plugged-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:18:16 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:16Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:78:10 10.100.0.12
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.195 182759 DEBUG nova.network.neutron [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.244 182759 DEBUG nova.compute.manager [req-ee24dd44-f6ec-485c-962a-940943af43fa req-2cef1a7a-33a3-4843-ae08-6d605f5b51da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Received event network-vif-deleted-fea9b5d1-8621-4677-a6ea-12b5e8e94d7f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.245 182759 INFO nova.compute.manager [req-ee24dd44-f6ec-485c-962a-940943af43fa req-2cef1a7a-33a3-4843-ae08-6d605f5b51da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Neutron deleted interface fea9b5d1-8621-4677-a6ea-12b5e8e94d7f; detaching it from the instance and deleting it from the info cache#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.245 182759 DEBUG nova.network.neutron [req-ee24dd44-f6ec-485c-962a-940943af43fa req-2cef1a7a-33a3-4843-ae08-6d605f5b51da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.329 182759 INFO nova.compute.manager [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Took 1.94 seconds to deallocate network for instance.#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.333 182759 DEBUG nova.compute.manager [req-ee24dd44-f6ec-485c-962a-940943af43fa req-2cef1a7a-33a3-4843-ae08-6d605f5b51da 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Detach interface failed, port_id=fea9b5d1-8621-4677-a6ea-12b5e8e94d7f, reason: Instance 8b2b13b2-3477-4c12-b3a9-2af6bab94065 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.399 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.399 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.495 182759 DEBUG nova.compute.provider_tree [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.516 182759 DEBUG nova.scheduler.client.report [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.544 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.573 182759 INFO nova.scheduler.client.report [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance 8b2b13b2-3477-4c12-b3a9-2af6bab94065#033[00m
Jan 21 19:18:16 np0005591285 nova_compute[182755]: 2026-01-22 00:18:16.650 182759 DEBUG oslo_concurrency.lockutils [None req-6fd8cac3-afe4-4e36-a416-8ccec5c742bf 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "8b2b13b2-3477-4c12-b3a9-2af6bab94065" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:17 np0005591285 podman[234015]: 2026-01-22 00:18:17.325994239 +0000 UTC m=+0.184314095 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller)
Jan 21 19:18:17 np0005591285 nova_compute[182755]: 2026-01-22 00:18:17.483 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:17 np0005591285 nova_compute[182755]: 2026-01-22 00:18:17.763 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:19 np0005591285 nova_compute[182755]: 2026-01-22 00:18:19.316 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:20 np0005591285 nova_compute[182755]: 2026-01-22 00:18:20.257 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:20 np0005591285 nova_compute[182755]: 2026-01-22 00:18:20.258 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:20 np0005591285 nova_compute[182755]: 2026-01-22 00:18:20.629 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:18:21 np0005591285 nova_compute[182755]: 2026-01-22 00:18:21.782 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:21 np0005591285 nova_compute[182755]: 2026-01-22 00:18:21.783 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:21 np0005591285 nova_compute[182755]: 2026-01-22 00:18:21.792 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:18:21 np0005591285 nova_compute[182755]: 2026-01-22 00:18:21.793 182759 INFO nova.compute.claims [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:18:21 np0005591285 nova_compute[182755]: 2026-01-22 00:18:21.930 182759 DEBUG nova.compute.provider_tree [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.028 182759 DEBUG nova.scheduler.client.report [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.056 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.057 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.123 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.123 182759 DEBUG nova.network.neutron [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.151 182759 INFO nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.172 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.314 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.316 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.317 182759 INFO nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Creating image(s)#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.318 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.318 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.320 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.347 182759 DEBUG nova.policy [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.351 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.408 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.410 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.410 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.421 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.484 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.487 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.528 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.529 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.530 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.606 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.607 182759 DEBUG nova.virt.disk.api [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.608 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.684 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.685 182759 DEBUG nova.virt.disk.api [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.686 182759 DEBUG nova.objects.instance [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid fb2dc221-bb45-4407-90b5-ce2fe888001c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.704 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.705 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Ensure instance console log exists: /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.705 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.706 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.706 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:22 np0005591285 nova_compute[182755]: 2026-01-22 00:18:22.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:23 np0005591285 nova_compute[182755]: 2026-01-22 00:18:23.031 182759 DEBUG nova.network.neutron [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Successfully created port: e38b41ce-ced6-421a-ade5-becfd62fa83d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.176 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '46feac9e-f412-4027-8cfb-f7280308085e', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000086', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'adb1305c8f874f2684e845e88fd95ffe', 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'hostId': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.179 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.205 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/memory.usage volume: 42.15234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da0efc64-58f4-4c55-954e-deac6a30f2bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.15234375, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'timestamp': '2026-01-22T00:18:23.179773', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'dd006b1e-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.924288122, 'message_signature': 'b4f389df9ac21a9014de28703ed2320cb2ca7812a5fd21c063b72a83c21edc35'}]}, 'timestamp': '2026-01-22 00:18:23.207067', '_unique_id': '8e31496ee52a4eec8a358045c8ccfe81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.214 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.215 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>]
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.220 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 46feac9e-f412-4027-8cfb-f7280308085e / tap7bc267e3-f7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.220 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83a85c23-9366-4f0e-9b1b-c3b74f71d06f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.216037', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd02b090-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': 'e47c5f0ff9935bfe772d3ad83326d497043cf0049d5109a560fbb4cc2620c3b9'}]}, 'timestamp': '2026-01-22 00:18:23.221425', '_unique_id': '95af0a452d6f4b61aa694a56b2f6679f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.257 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.read.latency volume: 259185656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.258 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.read.latency volume: 20026426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d5f8dcb-6613-4521-8566-03b85b5e6b32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 259185656, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.225324', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd084a0a-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '23a26fa2683e1b6478d38c0e074cddf9c4b6e8b1aa1692b21bc52bdd6858ca59'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20026426, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.225324', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd086472-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '9bc652165753582bab68ab13837ecaf5533d6a11a293ad0a900775a60a1c9eed'}]}, 'timestamp': '2026-01-22 00:18:23.258771', '_unique_id': '02a8b87d1e20483c8a1de8d6c9d5cb94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.260 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.261 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65c9970d-30d4-4161-a43b-0d872c3d8dba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.261865', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd08f518-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': '43fd2422aa3054f7087e3dfac887e88844cf74636fb05ac42bb32c560e633780'}]}, 'timestamp': '2026-01-22 00:18:23.262483', '_unique_id': 'ba5a8cfdff724d7ab68dd46d4c9ef7e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.263 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.264 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.279 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.allocation volume: 30150656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.280 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1e870eb-2a1a-4e33-b55f-17a090488a3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30150656, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.264947', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd0bb0d2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.984221941, 'message_signature': '83495248a2256818a100533bbb9746fed91b8676af843ee48618c3e581496736'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 
'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.264947', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd0bc6b2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.984221941, 'message_signature': '938705c561bf8ab997ffc720ba9e13d5d496baae9a8f489690b823070613e021'}]}, 'timestamp': '2026-01-22 00:18:23.281057', '_unique_id': '6b14f1c8f94443a99b898ba43796464d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.282 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.283 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.284 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/cpu volume: 12300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47146d65-c697-4f58-81cd-bc77b4ab1498', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12300000000, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'timestamp': '2026-01-22T00:18:23.284033', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dd0c592e-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.924288122, 'message_signature': '81db3587207247c1542e7d4772ee2fa7b5106d342236fb4d46d7812ed6b6876e'}]}, 'timestamp': '2026-01-22 00:18:23.284810', '_unique_id': 'e90f5737d6bb48f7ac0caa68168cb5b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.287 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.288 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.288 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>]
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.289 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f00c2ca8-e730-4e99-92ab-3582b8e41629', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.289212', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd0d2174-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': 'a11e768134a529689f969842fcc96c8aa7fe8ef8cc1fb53bd79cb66e4863274a'}]}, 'timestamp': '2026-01-22 00:18:23.289974', '_unique_id': '993caa3096c74926ad87d167a30126e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.294 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf4d1c97-f4ea-46e0-9697-3a9e0aa1c583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.293993', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd0ddcb8-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': 'e9f8c011f99789a2f694db4efd8890a4190a0127b4553d67ab2f23aa1e250220'}]}, 'timestamp': '2026-01-22 00:18:23.294737', '_unique_id': 'eeaad7ac9c1147379b543418f2e826d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.298 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a343ae0a-843a-43f1-8774-6995b669ceb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.298635', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd0e8f1e-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': '5e7c157a0993bdaf00ccf8e6c8f39307ad03cd19a4a7bebfa1628270d4fa16dc'}]}, 'timestamp': '2026-01-22 00:18:23.299239', '_unique_id': '55349707ca714f168edc863d1913c0b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.300 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.301 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.write.bytes volume: 274432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.302 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86d9fd05-bef3-4925-b370-c0d27d205cc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274432, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.301890', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd0f0b38-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': 'bf3a3c9b622c36b44a34bbcf601b66814dbbdf29c712d1f9ab9edd6c07cc0421'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.301890', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd0f19b6-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': 'd23ba29e7d02a4d3d08522692283e4780c9c6923197cfc831b84c0015c0e94ef'}]}, 'timestamp': '2026-01-22 00:18:23.302682', '_unique_id': 'b6ade3f454524f05a1eefdfabb85ccf1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.304 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.304 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.304 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>]
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.305 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.read.requests volume: 1210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.305 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3dbcce2-297d-46ef-904e-18e26104b117', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1210, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.305039', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd0f86ee-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': 'f2274bc927f5c0c4f44a7a15b98776ba0a066c9cc4033279ed40870409c6d9db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.305039', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd0f9418-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '7783427bbd4e49ebe094486b0d5c06da03a30e6f5628da8dc149d6ca2717d682'}]}, 'timestamp': '2026-01-22 00:18:23.305775', '_unique_id': 'c1b03c6e927a4097bae4f4f367915e0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.306 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.307 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.307 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a47ef29-beef-4bbf-9463-b617c74eebfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.307444', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd0fe184-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.984221941, 'message_signature': '40a2c4755c41761f069135fdaa72f8f86bee07c385dc02cfe2c6206d5d0dc80f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 
'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.307444', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd0fedb4-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.984221941, 'message_signature': 'afff9e84504f48de7af9854728ec576f95e7d151a2cac9d21f2cb5944735448d'}]}, 'timestamp': '2026-01-22 00:18:23.308072', '_unique_id': '302adc0f30384c549214ae9e1406863c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.309 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.309 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57a05d44-7499-43b1-af7b-39d5ac57ed06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.309641', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd1037e2-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': 'd4ea0e8b3e09f13dc4a448f017acf1f136a135462b63e1809088c19ca46f4bbe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 
'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.309641', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd10448a-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '0253a000a5639f5f05c3b9a9d1dbae425110405b4547fee2fa0ec915fde05b91'}]}, 'timestamp': '2026-01-22 00:18:23.310299', '_unique_id': '921a76d52b434bc38888a0f5a2b03fe9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.312 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.312 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1889498750>]
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.312 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.312 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af3d1a3c-bb80-4435-a046-459a94319c71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.312627', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd10b014-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': 'd50e154edb63dd80d36db9f539ca2092dd7fabd6f5afd13d26d59cd39684aca2'}]}, 'timestamp': '2026-01-22 00:18:23.313103', '_unique_id': '5f7447da62124f3aba54eacfcf787f9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.313 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.314 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.outgoing.bytes volume: 1096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12837ee6-72d9-48b2-90e4-6d397e5bc1a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1096, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.314662', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd10fc7c-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': 'd1cf57a833ea108b1d7809687e9e3d53b2ee30100af6aac907368eeab581f81d'}]}, 'timestamp': '2026-01-22 00:18:23.315043', '_unique_id': '1fd8f6545da643c797539f339efc2187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.315 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.316 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.write.latency volume: 32157016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.316 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baa6983b-8a5c-4337-9d8b-8538378d64d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32157016, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.316520', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd114510-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '29f34162006c916f1f3c30ca3029885d0563cc35ba33068e5d8255181e6ef995'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.316520', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd1150fa-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '6d599eb0c517066314febc977042fbac389ca041dc572d88e954d4a9d7bd7efa'}]}, 'timestamp': '2026-01-22 00:18:23.317157', '_unique_id': '53b4caadea6244e09a24fd16666d002e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.318 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.read.bytes volume: 32036864 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.319 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b16cc87f-3340-4e81-ae9d-06c6ca9c1a6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32036864, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.318745', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd119bbe-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '2169f8fdf26a598e1706676a44aee62dd561ea1f3de64e8ff4c2c2686379a019'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.318745', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd11a794-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.944554213, 'message_signature': '6e7389d4933805cc0b5ff704b66b613eb372b78d18990ae410cb33deb538e77b'}]}, 'timestamp': '2026-01-22 00:18:23.319367', '_unique_id': 'dc02a93be6c845e6a0a8b9bf67ccda40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.incoming.bytes volume: 1673 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b67ec22-b3e1-452a-b86c-513d9520ff2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1673, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.321002', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd11f3e8-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': '3240673215bafc54936080057c81fd87556b0ec28a21112f21a9ec162f9ef4ea'}]}, 'timestamp': '2026-01-22 00:18:23.321337', '_unique_id': 'd3ccc6777d27486ab0bf57e88b2ef52a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.321 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.322 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2afbd347-2f58-477e-a32a-12fe3983ab3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.322803', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd123a56-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': '3cb6f6b1bd1d9436131891e733b681bde4a3095f4c9ad18489f1da36b36fff8d'}]}, 'timestamp': '2026-01-22 00:18:23.323155', '_unique_id': '6ef1c5abaa894b088c29bb8bb2f462a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.324 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76a1fcbe-1104-42e7-ad08-957b44e1a1d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-vda', 'timestamp': '2026-01-22T00:18:23.324622', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dd128222-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.984221941, 'message_signature': 'b06b5450934fc93769e984464ced0938fd4d554f38690172329cffc10737c87e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': '46feac9e-f412-4027-8cfb-f7280308085e-sda', 'timestamp': '2026-01-22T00:18:23.324622', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'instance-00000086', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dd128ede-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.984221941, 'message_signature': 'a1f3d8e1a1446b6cea96f273f6c831fb6735b4acd5efd5ca72283c6c612c8c75'}]}, 'timestamp': '2026-01-22 00:18:23.325288', '_unique_id': '3f0912db6eb2473097f3297d742cebb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.325 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.326 12 DEBUG ceilometer.compute.pollsters [-] 46feac9e-f412-4027-8cfb-f7280308085e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b9fb3b6-d48f-4eb4-9df3-499d8415a9d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-00000086-46feac9e-f412-4027-8cfb-f7280308085e-tap7bc267e3-f7', 'timestamp': '2026-01-22T00:18:23.326755', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1889498750', 'name': 'tap7bc267e3-f7', 'instance_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'instance_type': 'm1.nano', 'host': 'd17a70097cc94a0e18076735cc020fe8b794763e3e9773db065e2c8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:38:78:10', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7bc267e3-f7'}, 'message_id': 'dd12d52e-f727-11f0-b13b-fa163e425b77', 'monotonic_time': 5661.935299856, 'message_signature': 'e01504d085fe71d628920ece4c2b3bb04f87f0b95fc68b440398cc375e3c4d69'}]}, 'timestamp': '2026-01-22 00:18:23.327139', '_unique_id': 'cd7af8dc3e334ede97568f13ebaa9bfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:18:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:18:23.327 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:18:23 np0005591285 nova_compute[182755]: 2026-01-22 00:18:23.405 182759 INFO nova.compute.manager [None req-331051d3-8540-4f7b-8a39-2a2694fcf16c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Get console output#033[00m
Jan 21 19:18:23 np0005591285 nova_compute[182755]: 2026-01-22 00:18:23.413 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:18:24 np0005591285 nova_compute[182755]: 2026-01-22 00:18:24.319 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:25 np0005591285 nova_compute[182755]: 2026-01-22 00:18:25.956 182759 DEBUG nova.network.neutron [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Successfully updated port: e38b41ce-ced6-421a-ade5-becfd62fa83d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:18:26 np0005591285 nova_compute[182755]: 2026-01-22 00:18:26.209 182759 DEBUG nova.compute.manager [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-changed-e38b41ce-ced6-421a-ade5-becfd62fa83d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:26 np0005591285 nova_compute[182755]: 2026-01-22 00:18:26.210 182759 DEBUG nova.compute.manager [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Refreshing instance network info cache due to event network-changed-e38b41ce-ced6-421a-ade5-becfd62fa83d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:18:26 np0005591285 nova_compute[182755]: 2026-01-22 00:18:26.210 182759 DEBUG oslo_concurrency.lockutils [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:26 np0005591285 nova_compute[182755]: 2026-01-22 00:18:26.210 182759 DEBUG oslo_concurrency.lockutils [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:26 np0005591285 nova_compute[182755]: 2026-01-22 00:18:26.210 182759 DEBUG nova.network.neutron [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Refreshing network info cache for port e38b41ce-ced6-421a-ade5-becfd62fa83d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:18:26 np0005591285 nova_compute[182755]: 2026-01-22 00:18:26.215 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.249 182759 DEBUG nova.network.neutron [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.518 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.519 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.519 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.519 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.520 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.769 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.828 182759 DEBUG nova.network.neutron [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:27 np0005591285 nova_compute[182755]: 2026-01-22 00:18:27.978 182759 INFO nova.compute.manager [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Terminating instance#033[00m
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.231 182759 DEBUG oslo_concurrency.lockutils [req-a55aa545-d111-4496-a980-10249498e857 req-c6b292d9-b49e-4361-8baa-b509f2ea5e76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.235 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.236 182759 DEBUG nova.network.neutron [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.788 182759 DEBUG nova.compute.manager [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:18:28 np0005591285 kernel: tap7bc267e3-f7 (unregistering): left promiscuous mode
Jan 21 19:18:28 np0005591285 NetworkManager[55017]: <info>  [1769041108.8235] device (tap7bc267e3-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.836 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:28Z|00546|binding|INFO|Releasing lport 7bc267e3-f762-4a18-a3a2-42a7161a231e from this chassis (sb_readonly=0)
Jan 21 19:18:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:28Z|00547|binding|INFO|Setting lport 7bc267e3-f762-4a18-a3a2-42a7161a231e down in Southbound
Jan 21 19:18:28 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:28Z|00548|binding|INFO|Removing iface tap7bc267e3-f7 ovn-installed in OVS
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.844 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.853 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:28 np0005591285 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 21 19:18:28 np0005591285 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000086.scope: Consumed 14.328s CPU time.
Jan 21 19:18:28 np0005591285 systemd-machined[154022]: Machine qemu-63-instance-00000086 terminated.
Jan 21 19:18:28 np0005591285 nova_compute[182755]: 2026-01-22 00:18:28.962 182759 DEBUG nova.network.neutron [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.071 182759 INFO nova.virt.libvirt.driver [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance destroyed successfully.#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.071 182759 DEBUG nova.objects.instance [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.157 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:78:10 10.100.0.12'], port_security=['fa:16:3e:38:78:10 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-184c07f2-f316-4056-b962-173c9a73cccb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '8', 'neutron:security_group_ids': '9e59c3b5-e637-42fe-b28f-811656431607', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac0dd3c8-754f-43f7-8c8a-c2e10a6719dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=7bc267e3-f762-4a18-a3a2-42a7161a231e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.159 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 7bc267e3-f762-4a18-a3a2-42a7161a231e in datapath 184c07f2-f316-4056-b962-173c9a73cccb unbound from our chassis#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.161 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 184c07f2-f316-4056-b962-173c9a73cccb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.162 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[347040bc-53a8-437e-8bbf-7520f4b419c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.163 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb namespace which is not needed anymore#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.287 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041094.2859576, 8b2b13b2-3477-4c12-b3a9-2af6bab94065 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.288 182759 INFO nova.compute.manager [-] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:29 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [NOTICE]   (233713) : haproxy version is 2.8.14-c23fe91
Jan 21 19:18:29 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [NOTICE]   (233713) : path to executable is /usr/sbin/haproxy
Jan 21 19:18:29 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [WARNING]  (233713) : Exiting Master process...
Jan 21 19:18:29 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [WARNING]  (233713) : Exiting Master process...
Jan 21 19:18:29 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [ALERT]    (233713) : Current worker (233715) exited with code 143 (Terminated)
Jan 21 19:18:29 np0005591285 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233707]: [WARNING]  (233713) : All workers exited. Exiting... (0)
Jan 21 19:18:29 np0005591285 systemd[1]: libpod-986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a.scope: Deactivated successfully.
Jan 21 19:18:29 np0005591285 podman[234096]: 2026-01-22 00:18:29.336210205 +0000 UTC m=+0.053061566 container died 986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.571 182759 DEBUG nova.compute.manager [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.572 182759 DEBUG nova.compute.manager [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing instance network info cache due to event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.572 182759 DEBUG oslo_concurrency.lockutils [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.573 182759 DEBUG oslo_concurrency.lockutils [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.573 182759 DEBUG nova.network.neutron [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:18:29 np0005591285 systemd[1]: var-lib-containers-storage-overlay-bb41bc3eaf541071b2398bd3c511117cfa24cbab927ad859e168138b2821b51e-merged.mount: Deactivated successfully.
Jan 21 19:18:29 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a-userdata-shm.mount: Deactivated successfully.
Jan 21 19:18:29 np0005591285 podman[234096]: 2026-01-22 00:18:29.697574499 +0000 UTC m=+0.414425900 container cleanup 986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.775 182759 DEBUG nova.virt.libvirt.vif [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:07Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.777 182759 DEBUG nova.network.os_vif_util [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.778 182759 DEBUG nova.network.os_vif_util [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.779 182759 DEBUG os_vif [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.782 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.783 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7bc267e3-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.788 182759 DEBUG nova.compute.manager [None req-1641a1ec-ed8e-49b9-b903-86cec7f53b3f - - - - - -] [instance: 8b2b13b2-3477-4c12-b3a9-2af6bab94065] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.789 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.793 182759 INFO os_vif [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7')#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.794 182759 INFO nova.virt.libvirt.driver [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Deleting instance files /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_del#033[00m
Jan 21 19:18:29 np0005591285 podman[234125]: 2026-01-22 00:18:29.801556181 +0000 UTC m=+0.067479910 container remove 986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.805 182759 INFO nova.virt.libvirt.driver [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Deletion of /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_del complete#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.811 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7e9909-8326-4e5a-aabd-77948edebeae]: (4, ('Thu Jan 22 12:18:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb (986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a)\n986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a\nThu Jan 22 12:18:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb (986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a)\n986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.813 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd365a0-a3d9-45f4-a95f-daeeea5b4267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.814 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap184c07f2-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:29 np0005591285 kernel: tap184c07f2-f0: left promiscuous mode
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.815 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.828 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.830 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d3fe3840-b486-4812-ba58-00ffdb85ef82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.848 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0925511-db8d-4801-84b3-501fd8264f55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.851 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5073900f-c222-4807-a8a6-9629bef41101]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.874 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[702c5de1-90ce-4d16-a811-b83c16c485c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564092, 'reachable_time': 27171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234140, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 systemd[1]: run-netns-ovnmeta\x2d184c07f2\x2df316\x2d4056\x2db962\x2d173c9a73cccb.mount: Deactivated successfully.
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.877 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:18:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:29.877 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[c59e4d61-9105-4f9e-84c4-2dcb7a6430c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:29 np0005591285 systemd[1]: libpod-conmon-986007551cc33596b587d10b32a53fff3f9f4c2df0ab3f4f40d20aae2e24e17a.scope: Deactivated successfully.
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.984 182759 INFO nova.compute.manager [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Took 1.19 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.985 182759 DEBUG oslo.service.loopingcall [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.985 182759 DEBUG nova.compute.manager [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:18:29 np0005591285 nova_compute[182755]: 2026-01-22 00:18:29.986 182759 DEBUG nova.network.neutron [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:18:31 np0005591285 podman[234142]: 2026-01-22 00:18:31.212184921 +0000 UTC m=+0.076265945 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:18:31 np0005591285 podman[234141]: 2026-01-22 00:18:31.216650459 +0000 UTC m=+0.074196228 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:18:31 np0005591285 nova_compute[182755]: 2026-01-22 00:18:31.498 182759 DEBUG nova.compute.manager [req-e188fe0f-f918-453f-aeee-83213253d959 req-6e36d7b9-bac1-4ec4-93d7-45bd42d2470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:31 np0005591285 nova_compute[182755]: 2026-01-22 00:18:31.500 182759 DEBUG oslo_concurrency.lockutils [req-e188fe0f-f918-453f-aeee-83213253d959 req-6e36d7b9-bac1-4ec4-93d7-45bd42d2470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:31 np0005591285 nova_compute[182755]: 2026-01-22 00:18:31.501 182759 DEBUG oslo_concurrency.lockutils [req-e188fe0f-f918-453f-aeee-83213253d959 req-6e36d7b9-bac1-4ec4-93d7-45bd42d2470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:18:31 np0005591285 nova_compute[182755]: 2026-01-22 00:18:31.501 182759 DEBUG oslo_concurrency.lockutils [req-e188fe0f-f918-453f-aeee-83213253d959 req-6e36d7b9-bac1-4ec4-93d7-45bd42d2470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:18:31 np0005591285 nova_compute[182755]: 2026-01-22 00:18:31.502 182759 DEBUG nova.compute.manager [req-e188fe0f-f918-453f-aeee-83213253d959 req-6e36d7b9-bac1-4ec4-93d7-45bd42d2470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:18:31 np0005591285 nova_compute[182755]: 2026-01-22 00:18:31.502 182759 DEBUG nova.compute.manager [req-e188fe0f-f918-453f-aeee-83213253d959 req-6e36d7b9-bac1-4ec4-93d7-45bd42d2470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.315 182759 DEBUG nova.network.neutron [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.354 182759 DEBUG nova.compute.manager [req-71d899b2-222b-4532-8ee8-f5c025992976 req-78fa42bd-10fc-419d-b3b3-2fac5e8da1ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-deleted-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.354 182759 INFO nova.compute.manager [req-71d899b2-222b-4532-8ee8-f5c025992976 req-78fa42bd-10fc-419d-b3b3-2fac5e8da1ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Neutron deleted interface 7bc267e3-f762-4a18-a3a2-42a7161a231e; detaching it from the instance and deleting it from the info cache
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.355 182759 DEBUG nova.network.neutron [req-71d899b2-222b-4532-8ee8-f5c025992976 req-78fa42bd-10fc-419d-b3b3-2fac5e8da1ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.515 182759 DEBUG nova.network.neutron [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Updating instance_info_cache with network_info: [{"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.771 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.914 182759 INFO nova.compute.manager [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Took 2.93 seconds to deallocate network for instance.
Jan 21 19:18:32 np0005591285 nova_compute[182755]: 2026-01-22 00:18:32.919 182759 DEBUG nova.compute.manager [req-71d899b2-222b-4532-8ee8-f5c025992976 req-78fa42bd-10fc-419d-b3b3-2fac5e8da1ad 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Detach interface failed, port_id=7bc267e3-f762-4a18-a3a2-42a7161a231e, reason: Instance 46feac9e-f412-4027-8cfb-f7280308085e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.061 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.061 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Instance network_info: |[{"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.066 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Start _get_guest_xml network_info=[{"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.072 182759 WARNING nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.077 182759 DEBUG nova.virt.libvirt.host [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.078 182759 DEBUG nova.virt.libvirt.host [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.081 182759 DEBUG nova.virt.libvirt.host [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.081 182759 DEBUG nova.virt.libvirt.host [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.082 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.082 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.083 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.083 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.083 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.083 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.084 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.084 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.084 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.084 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.084 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.084 182759 DEBUG nova.virt.hardware [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.087 182759 DEBUG nova.virt.libvirt.vif [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1617563778',display_name='tempest-ServersTestJSON-server-1617563778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1617563778',id=141,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-te429pxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},
tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:22Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=fb2dc221-bb45-4407-90b5-ce2fe888001c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.087 182759 DEBUG nova.network.os_vif_util [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.088 182759 DEBUG nova.network.os_vif_util [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.089 182759 DEBUG nova.objects.instance [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid fb2dc221-bb45-4407-90b5-ce2fe888001c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.236 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <uuid>fb2dc221-bb45-4407-90b5-ce2fe888001c</uuid>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <name>instance-0000008d</name>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersTestJSON-server-1617563778</nova:name>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:18:33</nova:creationTime>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        <nova:port uuid="e38b41ce-ced6-421a-ade5-becfd62fa83d">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <entry name="serial">fb2dc221-bb45-4407-90b5-ce2fe888001c</entry>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <entry name="uuid">fb2dc221-bb45-4407-90b5-ce2fe888001c</entry>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.config"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:df:32:6a"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <target dev="tape38b41ce-ce"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/console.log" append="off"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:18:33 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:18:33 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:18:33 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:18:33 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.236 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Preparing to wait for external event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.236 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.236 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.237 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.237 182759 DEBUG nova.virt.libvirt.vif [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1617563778',display_name='tempest-ServersTestJSON-server-1617563778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1617563778',id=141,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-te429pxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project
-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:22Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=fb2dc221-bb45-4407-90b5-ce2fe888001c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.237 182759 DEBUG nova.network.os_vif_util [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.238 182759 DEBUG nova.network.os_vif_util [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.238 182759 DEBUG os_vif [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.238 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.239 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.239 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.241 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.241 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape38b41ce-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.241 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape38b41ce-ce, col_values=(('external_ids', {'iface-id': 'e38b41ce-ced6-421a-ade5-becfd62fa83d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:32:6a', 'vm-uuid': 'fb2dc221-bb45-4407-90b5-ce2fe888001c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.243 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:33 np0005591285 NetworkManager[55017]: <info>  [1769041113.2437] manager: (tape38b41ce-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.246 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.253 182759 INFO os_vif [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce')#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.359 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.360 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.364 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.372 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.372 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.373 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:df:32:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.373 182759 INFO nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Using config drive#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.416 182759 INFO nova.scheduler.client.report [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance 46feac9e-f412-4027-8cfb-f7280308085e#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.678 182759 DEBUG oslo_concurrency.lockutils [None req-f2b55ebe-e91d-4ff6-8203-af6a61377b07 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.761 182759 DEBUG nova.compute.manager [req-f790b00e-5269-4770-b84d-9f287b80bacc req-476418f1-b820-40e4-91ca-14534d041fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.761 182759 DEBUG oslo_concurrency.lockutils [req-f790b00e-5269-4770-b84d-9f287b80bacc req-476418f1-b820-40e4-91ca-14534d041fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.762 182759 DEBUG oslo_concurrency.lockutils [req-f790b00e-5269-4770-b84d-9f287b80bacc req-476418f1-b820-40e4-91ca-14534d041fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.762 182759 DEBUG oslo_concurrency.lockutils [req-f790b00e-5269-4770-b84d-9f287b80bacc req-476418f1-b820-40e4-91ca-14534d041fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.762 182759 DEBUG nova.compute.manager [req-f790b00e-5269-4770-b84d-9f287b80bacc req-476418f1-b820-40e4-91ca-14534d041fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.762 182759 WARNING nova.compute.manager [req-f790b00e-5269-4770-b84d-9f287b80bacc req-476418f1-b820-40e4-91ca-14534d041fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.771 182759 DEBUG nova.network.neutron [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updated VIF entry in instance network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.771 182759 DEBUG nova.network.neutron [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:33 np0005591285 nova_compute[182755]: 2026-01-22 00:18:33.799 182759 DEBUG oslo_concurrency.lockutils [req-65dcc279-a222-4f4c-99ea-6f388b8ad6cf req-3a84112a-c8ff-46d4-a57c-06b5b02dba1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.084 182759 INFO nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Creating config drive at /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.config#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.091 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjer170_x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.223 182759 DEBUG oslo_concurrency.processutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjer170_x" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:34 np0005591285 kernel: tape38b41ce-ce: entered promiscuous mode
Jan 21 19:18:34 np0005591285 NetworkManager[55017]: <info>  [1769041114.3115] manager: (tape38b41ce-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.311 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:34Z|00549|binding|INFO|Claiming lport e38b41ce-ced6-421a-ade5-becfd62fa83d for this chassis.
Jan 21 19:18:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:34Z|00550|binding|INFO|e38b41ce-ced6-421a-ade5-becfd62fa83d: Claiming fa:16:3e:df:32:6a 10.100.0.10
Jan 21 19:18:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:34Z|00551|binding|INFO|Setting lport e38b41ce-ced6-421a-ade5-becfd62fa83d ovn-installed in OVS
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.327 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:34 np0005591285 systemd-udevd[234204]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:18:34 np0005591285 systemd-machined[154022]: New machine qemu-65-instance-0000008d.
Jan 21 19:18:34 np0005591285 NetworkManager[55017]: <info>  [1769041114.3508] device (tape38b41ce-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:18:34 np0005591285 NetworkManager[55017]: <info>  [1769041114.3519] device (tape38b41ce-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:18:34 np0005591285 systemd[1]: Started Virtual Machine qemu-65-instance-0000008d.
Jan 21 19:18:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:34Z|00552|binding|INFO|Setting lport e38b41ce-ced6-421a-ade5-becfd62fa83d up in Southbound
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.418 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:32:6a 10.100.0.10'], port_security=['fa:16:3e:df:32:6a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb2dc221-bb45-4407-90b5-ce2fe888001c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=e38b41ce-ced6-421a-ade5-becfd62fa83d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.419 104259 INFO neutron.agent.ovn.metadata.agent [-] Port e38b41ce-ced6-421a-ade5-becfd62fa83d in datapath aabf11c6-ef94-408a-8148-6c6400566606 bound to our chassis#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.421 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.431 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c8bc0c-5007-4411-be20-da595b58a98b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.432 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabf11c6-e1 in ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.434 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabf11c6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.434 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d38f76d-a343-48e4-9fcf-c991602d02bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.435 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e45819-a1eb-482a-aee2-e23e11ab8b46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.445 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[62a655af-7ce5-41fb-bfea-a6fa3033e854]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.467 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[af3e09d8-393b-4359-b0d6-c2f86647dcc8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.493 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[e98890f5-675e-4f45-aeca-6e7aa4923c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 NetworkManager[55017]: <info>  [1769041114.5007] manager: (tapaabf11c6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.501 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[16cc8484-093d-4df7-80b4-c57c6b43b033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.540 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[da171f74-48ca-4734-b42d-568bd8e6b5b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.544 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[980584ab-45dd-4096-8254-a873a7103d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.553 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:34 np0005591285 NetworkManager[55017]: <info>  [1769041114.5701] device (tapaabf11c6-e0): carrier: link connected
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.578 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[521b0f29-456d-4b8a-90df-51fb53934eff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.597 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[96f3cea8-6acf-469c-a6ad-1ad9fa0eb501]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567322, 'reachable_time': 24572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234240, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.613 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[75b401b2-aedc-4efc-9b25-c3857ca65c66]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:1b62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567322, 'tstamp': 567322}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234245, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.627 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8d588695-9235-42fc-8fef-1257498020f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567322, 'reachable_time': 24572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234246, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.657 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cd2db9c7-deaf-428c-a725-f0879b25e9b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.671 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041114.670785, fb2dc221-bb45-4407-90b5-ce2fe888001c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.672 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] VM Started (Lifecycle Event)#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.719 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[03b13e3a-a11c-4dc7-a850-afd0f1c5a805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.720 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.720 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.720 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:34 np0005591285 kernel: tapaabf11c6-e0: entered promiscuous mode
Jan 21 19:18:34 np0005591285 NetworkManager[55017]: <info>  [1769041114.7231] manager: (tapaabf11c6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.722 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.725 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.725 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:34Z|00553|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.738 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.739 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e1115f80-cdcc-4b30-a4b5-53eb8ad21481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.740 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-aabf11c6-ef94-408a-8148-6c6400566606
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID aabf11c6-ef94-408a-8148-6c6400566606
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:18:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:34.740 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'env', 'PROCESS_TAG=haproxy-aabf11c6-ef94-408a-8148-6c6400566606', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabf11c6-ef94-408a-8148-6c6400566606.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.843 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.848 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041114.6710615, fb2dc221-bb45-4407-90b5-ce2fe888001c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.848 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.920 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:34 np0005591285 nova_compute[182755]: 2026-01-22 00:18:34.925 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:18:35 np0005591285 nova_compute[182755]: 2026-01-22 00:18:35.023 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:18:35 np0005591285 podman[234278]: 2026-01-22 00:18:35.074436752 +0000 UTC m=+0.045828112 container create 5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 19:18:35 np0005591285 systemd[1]: Started libpod-conmon-5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e.scope.
Jan 21 19:18:35 np0005591285 podman[234278]: 2026-01-22 00:18:35.052881708 +0000 UTC m=+0.024273088 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:18:35 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:18:35 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12c0bf2f75b026d74a017bfbd3b772a3907811bbc0704f958a28599947da93f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:18:35 np0005591285 podman[234278]: 2026-01-22 00:18:35.175845937 +0000 UTC m=+0.147237317 container init 5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 19:18:35 np0005591285 podman[234278]: 2026-01-22 00:18:35.180560752 +0000 UTC m=+0.151952112 container start 5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:18:35 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [NOTICE]   (234298) : New worker (234300) forked
Jan 21 19:18:35 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [NOTICE]   (234298) : Loading success.
Jan 21 19:18:37 np0005591285 nova_compute[182755]: 2026-01-22 00:18:37.773 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.244 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.990 182759 DEBUG nova.compute.manager [req-190f7572-24c8-46a4-9d4c-bd03e986235d req-f5045bcd-939d-41ec-8923-824484781622 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.990 182759 DEBUG oslo_concurrency.lockutils [req-190f7572-24c8-46a4-9d4c-bd03e986235d req-f5045bcd-939d-41ec-8923-824484781622 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.991 182759 DEBUG oslo_concurrency.lockutils [req-190f7572-24c8-46a4-9d4c-bd03e986235d req-f5045bcd-939d-41ec-8923-824484781622 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.992 182759 DEBUG oslo_concurrency.lockutils [req-190f7572-24c8-46a4-9d4c-bd03e986235d req-f5045bcd-939d-41ec-8923-824484781622 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.992 182759 DEBUG nova.compute.manager [req-190f7572-24c8-46a4-9d4c-bd03e986235d req-f5045bcd-939d-41ec-8923-824484781622 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Processing event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.993 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:18:38 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.999 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041118.998858, fb2dc221-bb45-4407-90b5-ce2fe888001c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:38.999 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.005 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.011 182759 INFO nova.virt.libvirt.driver [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Instance spawned successfully.#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.011 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.686 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.691 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.694 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.694 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.695 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.695 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.695 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.696 182759 DEBUG nova.virt.libvirt.driver [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:39 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:39Z|00554|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.891 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:39 np0005591285 nova_compute[182755]: 2026-01-22 00:18:39.993 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:18:40 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:40Z|00555|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 21 19:18:40 np0005591285 nova_compute[182755]: 2026-01-22 00:18:40.051 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:40 np0005591285 nova_compute[182755]: 2026-01-22 00:18:40.056 182759 INFO nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Took 17.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:18:40 np0005591285 nova_compute[182755]: 2026-01-22 00:18:40.057 182759 DEBUG nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:40 np0005591285 nova_compute[182755]: 2026-01-22 00:18:40.164 182759 INFO nova.compute.manager [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Took 18.43 seconds to build instance.#033[00m
Jan 21 19:18:40 np0005591285 nova_compute[182755]: 2026-01-22 00:18:40.189 182759 DEBUG oslo_concurrency.lockutils [None req-634ebf58-c462-4745-85a4-9ebe826b43b8 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:41 np0005591285 podman[234310]: 2026-01-22 00:18:41.178734861 +0000 UTC m=+0.053906858 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:18:41 np0005591285 nova_compute[182755]: 2026-01-22 00:18:41.194 182759 DEBUG nova.compute.manager [req-fc2b43f9-ab53-446d-911f-27c2fde61e10 req-5278db0b-0ccb-4ac3-9fb6-db27a3501c75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:41 np0005591285 nova_compute[182755]: 2026-01-22 00:18:41.195 182759 DEBUG oslo_concurrency.lockutils [req-fc2b43f9-ab53-446d-911f-27c2fde61e10 req-5278db0b-0ccb-4ac3-9fb6-db27a3501c75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:41 np0005591285 nova_compute[182755]: 2026-01-22 00:18:41.195 182759 DEBUG oslo_concurrency.lockutils [req-fc2b43f9-ab53-446d-911f-27c2fde61e10 req-5278db0b-0ccb-4ac3-9fb6-db27a3501c75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:41 np0005591285 nova_compute[182755]: 2026-01-22 00:18:41.195 182759 DEBUG oslo_concurrency.lockutils [req-fc2b43f9-ab53-446d-911f-27c2fde61e10 req-5278db0b-0ccb-4ac3-9fb6-db27a3501c75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:41 np0005591285 nova_compute[182755]: 2026-01-22 00:18:41.196 182759 DEBUG nova.compute.manager [req-fc2b43f9-ab53-446d-911f-27c2fde61e10 req-5278db0b-0ccb-4ac3-9fb6-db27a3501c75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] No waiting events found dispatching network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:41 np0005591285 nova_compute[182755]: 2026-01-22 00:18:41.196 182759 WARNING nova.compute.manager [req-fc2b43f9-ab53-446d-911f-27c2fde61e10 req-5278db0b-0ccb-4ac3-9fb6-db27a3501c75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received unexpected event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d for instance with vm_state active and task_state None.#033[00m
Jan 21 19:18:42 np0005591285 nova_compute[182755]: 2026-01-22 00:18:42.775 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:43 np0005591285 nova_compute[182755]: 2026-01-22 00:18:43.247 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.071 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041109.0687966, 46feac9e-f412-4027-8cfb-f7280308085e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.071 182759 INFO nova.compute.manager [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.104 182759 DEBUG nova.compute.manager [None req-8332f9af-204b-4c30-a879-26091f959fec - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.226 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.227 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.278 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.629 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.629 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.638 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.639 182759 INFO nova.compute.claims [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.958 182759 DEBUG nova.compute.provider_tree [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:18:44 np0005591285 nova_compute[182755]: 2026-01-22 00:18:44.978 182759 DEBUG nova.scheduler.client.report [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.006 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.007 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.075 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.076 182759 DEBUG nova.network.neutron [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.108 182759 INFO nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.129 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:18:45 np0005591285 podman[234336]: 2026-01-22 00:18:45.202932872 +0000 UTC m=+0.074585640 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:18:45 np0005591285 podman[234335]: 2026-01-22 00:18:45.225462182 +0000 UTC m=+0.089471546 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.226 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.304 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.306 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.306 182759 INFO nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Creating image(s)#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.307 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.307 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.308 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.322 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.377 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.378 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.378 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.390 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.411 182759 DEBUG nova.policy [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.448 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.449 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.487 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.488 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.489 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.546 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.547 182759 DEBUG nova.virt.disk.api [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.547 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.601 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.602 182759 DEBUG nova.virt.disk.api [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.603 182759 DEBUG nova.objects.instance [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.632 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.632 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Ensure instance console log exists: /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.633 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.633 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:45 np0005591285 nova_compute[182755]: 2026-01-22 00:18:45.634 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:46 np0005591285 nova_compute[182755]: 2026-01-22 00:18:46.436 182759 DEBUG nova.network.neutron [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Successfully created port: 4fc5d12c-3000-4f2c-b8b8-de8878075df7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.778 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.844 182759 DEBUG nova.network.neutron [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Successfully updated port: 4fc5d12c-3000-4f2c-b8b8-de8878075df7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.875 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.876 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.876 182759 DEBUG nova.network.neutron [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.985 182759 DEBUG nova.compute.manager [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Received event network-changed-4fc5d12c-3000-4f2c-b8b8-de8878075df7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.985 182759 DEBUG nova.compute.manager [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Refreshing instance network info cache due to event network-changed-4fc5d12c-3000-4f2c-b8b8-de8878075df7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:18:47 np0005591285 nova_compute[182755]: 2026-01-22 00:18:47.986 182759 DEBUG oslo_concurrency.lockutils [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:48 np0005591285 nova_compute[182755]: 2026-01-22 00:18:48.119 182759 DEBUG nova.network.neutron [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:18:48 np0005591285 nova_compute[182755]: 2026-01-22 00:18:48.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:48 np0005591285 nova_compute[182755]: 2026-01-22 00:18:48.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:18:48 np0005591285 podman[234391]: 2026-01-22 00:18:48.222100357 +0000 UTC m=+0.094579753 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:18:48 np0005591285 nova_compute[182755]: 2026-01-22 00:18:48.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.195 182759 DEBUG nova.network.neutron [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Updating instance_info_cache with network_info: [{"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.220 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.220 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Instance network_info: |[{"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.220 182759 DEBUG oslo_concurrency.lockutils [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.221 182759 DEBUG nova.network.neutron [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Refreshing network info cache for port 4fc5d12c-3000-4f2c-b8b8-de8878075df7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.224 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Start _get_guest_xml network_info=[{"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.230 182759 WARNING nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.236 182759 DEBUG nova.virt.libvirt.host [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.238 182759 DEBUG nova.virt.libvirt.host [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.251 182759 DEBUG nova.virt.libvirt.host [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.253 182759 DEBUG nova.virt.libvirt.host [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.256 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.256 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.258 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.258 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.259 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.259 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.260 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.261 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.261 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.262 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.263 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.263 182759 DEBUG nova.virt.hardware [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.272 182759 DEBUG nova.virt.libvirt.vif [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1617563778',display_name='tempest-ServersTestJSON-server-1617563778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1617563778',id=143,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-lk75n3y9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:45Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=c4b8d52a-d6d6-4588-91ae-5eedc3a8db48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.273 182759 DEBUG nova.network.os_vif_util [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.274 182759 DEBUG nova.network.os_vif_util [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.275 182759 DEBUG nova.objects.instance [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.292 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <uuid>c4b8d52a-d6d6-4588-91ae-5eedc3a8db48</uuid>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <name>instance-0000008f</name>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersTestJSON-server-1617563778</nova:name>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:18:49</nova:creationTime>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        <nova:port uuid="4fc5d12c-3000-4f2c-b8b8-de8878075df7">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <entry name="serial">c4b8d52a-d6d6-4588-91ae-5eedc3a8db48</entry>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <entry name="uuid">c4b8d52a-d6d6-4588-91ae-5eedc3a8db48</entry>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.config"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:31:8d:34"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <target dev="tap4fc5d12c-30"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/console.log" append="off"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:18:49 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:18:49 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:18:49 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:18:49 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.301 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Preparing to wait for external event network-vif-plugged-4fc5d12c-3000-4f2c-b8b8-de8878075df7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.301 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.302 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.302 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.303 182759 DEBUG nova.virt.libvirt.vif [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1617563778',display_name='tempest-ServersTestJSON-server-1617563778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1617563778',id=143,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-lk75n3y9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project
-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:45Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=c4b8d52a-d6d6-4588-91ae-5eedc3a8db48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.303 182759 DEBUG nova.network.os_vif_util [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.304 182759 DEBUG nova.network.os_vif_util [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.304 182759 DEBUG os_vif [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.305 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.307 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.308 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.315 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.316 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4fc5d12c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.316 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4fc5d12c-30, col_values=(('external_ids', {'iface-id': '4fc5d12c-3000-4f2c-b8b8-de8878075df7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:8d:34', 'vm-uuid': 'c4b8d52a-d6d6-4588-91ae-5eedc3a8db48'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.318 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:49 np0005591285 NetworkManager[55017]: <info>  [1769041129.3198] manager: (tap4fc5d12c-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.323 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.325 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.326 182759 INFO os_vif [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30')#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.546 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.547 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.547 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:31:8d:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:18:49 np0005591285 nova_compute[182755]: 2026-01-22 00:18:49.548 182759 INFO nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Using config drive#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.242 182759 INFO nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Creating config drive at /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.config#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.247 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvttjcjb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.269 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.374 182759 DEBUG oslo_concurrency.processutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdvttjcjb" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.423 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.424 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.424 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.424 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fb2dc221-bb45-4407-90b5-ce2fe888001c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:50 np0005591285 kernel: tap4fc5d12c-30: entered promiscuous mode
Jan 21 19:18:50 np0005591285 NetworkManager[55017]: <info>  [1769041130.4313] manager: (tap4fc5d12c-30): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Jan 21 19:18:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:50Z|00556|binding|INFO|Claiming lport 4fc5d12c-3000-4f2c-b8b8-de8878075df7 for this chassis.
Jan 21 19:18:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:50Z|00557|binding|INFO|4fc5d12c-3000-4f2c-b8b8-de8878075df7: Claiming fa:16:3e:31:8d:34 10.100.0.4
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.434 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.440 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:8d:34 10.100.0.4'], port_security=['fa:16:3e:31:8d:34 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c4b8d52a-d6d6-4588-91ae-5eedc3a8db48', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=4fc5d12c-3000-4f2c-b8b8-de8878075df7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.442 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 4fc5d12c-3000-4f2c-b8b8-de8878075df7 in datapath aabf11c6-ef94-408a-8148-6c6400566606 bound to our chassis#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.443 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606#033[00m
Jan 21 19:18:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:50Z|00558|binding|INFO|Setting lport 4fc5d12c-3000-4f2c-b8b8-de8878075df7 ovn-installed in OVS
Jan 21 19:18:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:50Z|00559|binding|INFO|Setting lport 4fc5d12c-3000-4f2c-b8b8-de8878075df7 up in Southbound
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.453 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.457 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.463 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[433b6d62-da73-4cb8-82ce-96769f26d3eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:50 np0005591285 systemd-udevd[234442]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:18:50 np0005591285 systemd-machined[154022]: New machine qemu-66-instance-0000008f.
Jan 21 19:18:50 np0005591285 NetworkManager[55017]: <info>  [1769041130.4906] device (tap4fc5d12c-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:18:50 np0005591285 NetworkManager[55017]: <info>  [1769041130.4913] device (tap4fc5d12c-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:18:50 np0005591285 systemd[1]: Started Virtual Machine qemu-66-instance-0000008f.
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.503 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6773ee-689b-46dd-a93c-cfe4cc72b82c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.508 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a8acfc0b-fd5c-4c12-a642-0141c1d6408d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.532 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[07e6e881-61ff-4a96-85fb-de0a6a4e3e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.549 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d30a38-d3f8-429b-bee8-e86e9c153911]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567322, 'reachable_time': 24572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234464, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.564 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8dcf2e52-3353-4f06-947e-4d27a45750aa]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaabf11c6-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567334, 'tstamp': 567334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234465, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaabf11c6-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567337, 'tstamp': 567337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234465, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.566 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.570 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:50 np0005591285 nova_compute[182755]: 2026-01-22 00:18:50.567 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.570 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.570 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:50.571 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:50Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:32:6a 10.100.0.10
Jan 21 19:18:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:50Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:32:6a 10.100.0.10
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.105 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041131.104307, c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.105 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] VM Started (Lifecycle Event)#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.138 182759 DEBUG nova.network.neutron [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Updated VIF entry in instance network info cache for port 4fc5d12c-3000-4f2c-b8b8-de8878075df7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.138 182759 DEBUG nova.network.neutron [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Updating instance_info_cache with network_info: [{"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.141 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.146 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041131.108508, c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.146 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.167 182759 DEBUG oslo_concurrency.lockutils [req-6e399745-1ef5-4af8-a0c0-9056700f6a53 req-78748899-0f48-456a-98cf-cc2dec1d9f62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.169 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.174 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.202 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.488 182759 DEBUG nova.compute.manager [req-40053881-2c43-45a1-a7e1-a2e157097489 req-f28979da-f7c2-43dd-a760-26ca56d97703 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Received event network-vif-plugged-4fc5d12c-3000-4f2c-b8b8-de8878075df7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.489 182759 DEBUG oslo_concurrency.lockutils [req-40053881-2c43-45a1-a7e1-a2e157097489 req-f28979da-f7c2-43dd-a760-26ca56d97703 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.489 182759 DEBUG oslo_concurrency.lockutils [req-40053881-2c43-45a1-a7e1-a2e157097489 req-f28979da-f7c2-43dd-a760-26ca56d97703 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.490 182759 DEBUG oslo_concurrency.lockutils [req-40053881-2c43-45a1-a7e1-a2e157097489 req-f28979da-f7c2-43dd-a760-26ca56d97703 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.490 182759 DEBUG nova.compute.manager [req-40053881-2c43-45a1-a7e1-a2e157097489 req-f28979da-f7c2-43dd-a760-26ca56d97703 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Processing event network-vif-plugged-4fc5d12c-3000-4f2c-b8b8-de8878075df7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.497 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.501 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041131.5013027, c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.502 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.505 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.509 182759 INFO nova.virt.libvirt.driver [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Instance spawned successfully.#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.510 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.526 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.532 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.548 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.548 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.549 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.550 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.551 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.552 182759 DEBUG nova.virt.libvirt.driver [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.560 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.641 182759 INFO nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Took 6.34 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.641 182759 DEBUG nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.752 182759 INFO nova.compute.manager [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Took 7.30 seconds to build instance.#033[00m
Jan 21 19:18:51 np0005591285 nova_compute[182755]: 2026-01-22 00:18:51.770 182759 DEBUG oslo_concurrency.lockutils [None req-b353149b-a80c-45b9-8938-808359b67be0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.324 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Updating instance_info_cache with network_info: [{"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.345 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-fb2dc221-bb45-4407-90b5-ce2fe888001c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.345 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.346 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.346 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:52.547 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:52.547 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:18:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:52.548 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.601 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:52 np0005591285 nova_compute[182755]: 2026-01-22 00:18:52.781 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:53 np0005591285 nova_compute[182755]: 2026-01-22 00:18:53.565 182759 DEBUG nova.compute.manager [req-3a856994-1b7c-4772-bb28-f6c6c7047854 req-5d1c1f4f-ebd2-4f58-bb17-c952f061acd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Received event network-vif-plugged-4fc5d12c-3000-4f2c-b8b8-de8878075df7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:53 np0005591285 nova_compute[182755]: 2026-01-22 00:18:53.566 182759 DEBUG oslo_concurrency.lockutils [req-3a856994-1b7c-4772-bb28-f6c6c7047854 req-5d1c1f4f-ebd2-4f58-bb17-c952f061acd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:53 np0005591285 nova_compute[182755]: 2026-01-22 00:18:53.566 182759 DEBUG oslo_concurrency.lockutils [req-3a856994-1b7c-4772-bb28-f6c6c7047854 req-5d1c1f4f-ebd2-4f58-bb17-c952f061acd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:53 np0005591285 nova_compute[182755]: 2026-01-22 00:18:53.566 182759 DEBUG oslo_concurrency.lockutils [req-3a856994-1b7c-4772-bb28-f6c6c7047854 req-5d1c1f4f-ebd2-4f58-bb17-c952f061acd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:53 np0005591285 nova_compute[182755]: 2026-01-22 00:18:53.566 182759 DEBUG nova.compute.manager [req-3a856994-1b7c-4772-bb28-f6c6c7047854 req-5d1c1f4f-ebd2-4f58-bb17-c952f061acd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] No waiting events found dispatching network-vif-plugged-4fc5d12c-3000-4f2c-b8b8-de8878075df7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:18:53 np0005591285 nova_compute[182755]: 2026-01-22 00:18:53.566 182759 WARNING nova.compute.manager [req-3a856994-1b7c-4772-bb28-f6c6c7047854 req-5d1c1f4f-ebd2-4f58-bb17-c952f061acd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Received unexpected event network-vif-plugged-4fc5d12c-3000-4f2c-b8b8-de8878075df7 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.320 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.327 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.388 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.389 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.466 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.471 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.541 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.542 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.596 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.733 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.734 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5453MB free_disk=73.16363906860352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.734 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.735 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.830 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance fb2dc221-bb45-4407-90b5-ce2fe888001c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.830 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.831 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.831 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.886 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.903 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.927 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:18:54 np0005591285 nova_compute[182755]: 2026-01-22 00:18:54.928 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.745 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.746 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.746 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.746 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.747 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.761 182759 INFO nova.compute.manager [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Terminating instance#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.772 182759 DEBUG nova.compute.manager [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:18:57 np0005591285 kernel: tap4fc5d12c-30 (unregistering): left promiscuous mode
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.783 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:57 np0005591285 NetworkManager[55017]: <info>  [1769041137.7884] device (tap4fc5d12c-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:18:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:57Z|00560|binding|INFO|Releasing lport 4fc5d12c-3000-4f2c-b8b8-de8878075df7 from this chassis (sb_readonly=0)
Jan 21 19:18:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:57Z|00561|binding|INFO|Setting lport 4fc5d12c-3000-4f2c-b8b8-de8878075df7 down in Southbound
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.796 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:18:57Z|00562|binding|INFO|Removing iface tap4fc5d12c-30 ovn-installed in OVS
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.815 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:8d:34 10.100.0.4'], port_security=['fa:16:3e:31:8d:34 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c4b8d52a-d6d6-4588-91ae-5eedc3a8db48', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=4fc5d12c-3000-4f2c-b8b8-de8878075df7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.815 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.816 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 4fc5d12c-3000-4f2c-b8b8-de8878075df7 in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.818 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606#033[00m
Jan 21 19:18:57 np0005591285 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Jan 21 19:18:57 np0005591285 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000008f.scope: Consumed 6.969s CPU time.
Jan 21 19:18:57 np0005591285 systemd-machined[154022]: Machine qemu-66-instance-0000008f terminated.
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.832 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[68437aae-8eeb-483c-a215-235bc26cf39b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.866 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[5e932b96-9c8d-46d5-97b4-e132589cc373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.870 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc2db95-0ba8-4de8-be01-bb403d10a569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.896 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[70d6226f-6748-4209-ac24-4590d3d98ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.910 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[37502b5a-1fb8-462e-964e-7bdfbbbff225]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567322, 'reachable_time': 24572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234500, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.927 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6200f83d-3375-49ce-99d3-bd554acf7418]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapaabf11c6-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567334, 'tstamp': 567334}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234501, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapaabf11c6-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567337, 'tstamp': 567337}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234501, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.928 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.929 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:57 np0005591285 nova_compute[182755]: 2026-01-22 00:18:57.935 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.936 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.936 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.937 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:18:57.937 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.047 182759 INFO nova.virt.libvirt.driver [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Instance destroyed successfully.#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.048 182759 DEBUG nova.objects.instance [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.067 182759 DEBUG nova.virt.libvirt.vif [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1617563778',display_name='tempest-ServersTestJSON-server-1617563778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1617563778',id=143,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-lk75n3y9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',imag
e_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:51Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=c4b8d52a-d6d6-4588-91ae-5eedc3a8db48,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.067 182759 DEBUG nova.network.os_vif_util [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "address": "fa:16:3e:31:8d:34", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fc5d12c-30", "ovs_interfaceid": "4fc5d12c-3000-4f2c-b8b8-de8878075df7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.069 182759 DEBUG nova.network.os_vif_util [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.069 182759 DEBUG os_vif [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.072 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.073 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4fc5d12c-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.075 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.078 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.081 182759 INFO os_vif [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:8d:34,bridge_name='br-int',has_traffic_filtering=True,id=4fc5d12c-3000-4f2c-b8b8-de8878075df7,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fc5d12c-30')#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.083 182759 INFO nova.virt.libvirt.driver [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Deleting instance files /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48_del#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.084 182759 INFO nova.virt.libvirt.driver [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Deletion of /var/lib/nova/instances/c4b8d52a-d6d6-4588-91ae-5eedc3a8db48_del complete#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.173 182759 INFO nova.compute.manager [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.174 182759 DEBUG oslo.service.loopingcall [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.174 182759 DEBUG nova.compute.manager [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.174 182759 DEBUG nova.network.neutron [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.817 182759 DEBUG nova.network.neutron [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.849 182759 INFO nova.compute.manager [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Took 0.67 seconds to deallocate network for instance.#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.928 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:18:58 np0005591285 nova_compute[182755]: 2026-01-22 00:18:58.941 182759 DEBUG nova.compute.manager [req-18062953-4bbe-4ca5-8496-2e38282b87eb req-fb24d21d-2084-4695-b934-0fb893fc4f57 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Received event network-vif-deleted-4fc5d12c-3000-4f2c-b8b8-de8878075df7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.191 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.192 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.293 182759 DEBUG nova.compute.provider_tree [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.318 182759 DEBUG nova.scheduler.client.report [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.339 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.370 182759 INFO nova.scheduler.client.report [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance c4b8d52a-d6d6-4588-91ae-5eedc3a8db48#033[00m
Jan 21 19:18:59 np0005591285 nova_compute[182755]: 2026-01-22 00:18:59.446 182759 DEBUG oslo_concurrency.lockutils [None req-81bda5d3-e615-47cd-beaa-1299551f058e 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "c4b8d52a-d6d6-4588-91ae-5eedc3a8db48" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.748 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.749 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.750 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.751 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.751 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.764 182759 INFO nova.compute.manager [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Terminating instance#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.775 182759 DEBUG nova.compute.manager [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:19:00 np0005591285 kernel: tape38b41ce-ce (unregistering): left promiscuous mode
Jan 21 19:19:00 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:00Z|00563|binding|INFO|Releasing lport e38b41ce-ced6-421a-ade5-becfd62fa83d from this chassis (sb_readonly=0)
Jan 21 19:19:00 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:00Z|00564|binding|INFO|Setting lport e38b41ce-ced6-421a-ade5-becfd62fa83d down in Southbound
Jan 21 19:19:00 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:00Z|00565|binding|INFO|Removing iface tape38b41ce-ce ovn-installed in OVS
Jan 21 19:19:00 np0005591285 NetworkManager[55017]: <info>  [1769041140.8075] device (tape38b41ce-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:00.823 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:32:6a 10.100.0.10'], port_security=['fa:16:3e:df:32:6a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb2dc221-bb45-4407-90b5-ce2fe888001c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=e38b41ce-ced6-421a-ade5-becfd62fa83d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:19:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:00.824 104259 INFO neutron.agent.ovn.metadata.agent [-] Port e38b41ce-ced6-421a-ade5-becfd62fa83d in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis#033[00m
Jan 21 19:19:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:00.825 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabf11c6-ef94-408a-8148-6c6400566606, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:19:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:00.826 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[18c1df85-88af-4733-ba3d-32d2d70d3f27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:00.827 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace which is not needed anymore#033[00m
Jan 21 19:19:00 np0005591285 nova_compute[182755]: 2026-01-22 00:19:00.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:00 np0005591285 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008d.scope: Deactivated successfully.
Jan 21 19:19:00 np0005591285 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000008d.scope: Consumed 13.194s CPU time.
Jan 21 19:19:00 np0005591285 systemd-machined[154022]: Machine qemu-65-instance-0000008d terminated.
Jan 21 19:19:00 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [NOTICE]   (234298) : haproxy version is 2.8.14-c23fe91
Jan 21 19:19:00 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [NOTICE]   (234298) : path to executable is /usr/sbin/haproxy
Jan 21 19:19:00 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [WARNING]  (234298) : Exiting Master process...
Jan 21 19:19:00 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [ALERT]    (234298) : Current worker (234300) exited with code 143 (Terminated)
Jan 21 19:19:00 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234294]: [WARNING]  (234298) : All workers exited. Exiting... (0)
Jan 21 19:19:00 np0005591285 systemd[1]: libpod-5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e.scope: Deactivated successfully.
Jan 21 19:19:00 np0005591285 podman[234545]: 2026-01-22 00:19:00.95793988 +0000 UTC m=+0.043485861 container died 5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:19:00 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e-userdata-shm.mount: Deactivated successfully.
Jan 21 19:19:01 np0005591285 systemd[1]: var-lib-containers-storage-overlay-12c0bf2f75b026d74a017bfbd3b772a3907811bbc0704f958a28599947da93f0-merged.mount: Deactivated successfully.
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.001 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.004 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 podman[234545]: 2026-01-22 00:19:01.005988701 +0000 UTC m=+0.091534642 container cleanup 5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:19:01 np0005591285 systemd[1]: libpod-conmon-5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e.scope: Deactivated successfully.
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.042 182759 INFO nova.virt.libvirt.driver [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Instance destroyed successfully.#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.042 182759 DEBUG nova.objects.instance [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid fb2dc221-bb45-4407-90b5-ce2fe888001c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.058 182759 DEBUG nova.virt.libvirt.vif [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1617563778',display_name='tempest-ServersTestJSON-server-1617563778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1617563778',id=141,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-te429pxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:40Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=fb2dc221-bb45-4407-90b5-ce2fe888001c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.059 182759 DEBUG nova.network.os_vif_util [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "address": "fa:16:3e:df:32:6a", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape38b41ce-ce", "ovs_interfaceid": "e38b41ce-ced6-421a-ade5-becfd62fa83d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.059 182759 DEBUG nova.network.os_vif_util [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.059 182759 DEBUG os_vif [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.060 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.061 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape38b41ce-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.062 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.065 182759 INFO os_vif [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:32:6a,bridge_name='br-int',has_traffic_filtering=True,id=e38b41ce-ced6-421a-ade5-becfd62fa83d,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape38b41ce-ce')#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.065 182759 INFO nova.virt.libvirt.driver [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Deleting instance files /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c_del#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.066 182759 INFO nova.virt.libvirt.driver [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Deletion of /var/lib/nova/instances/fb2dc221-bb45-4407-90b5-ce2fe888001c_del complete#033[00m
Jan 21 19:19:01 np0005591285 podman[234584]: 2026-01-22 00:19:01.071739763 +0000 UTC m=+0.043896380 container remove 5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.076 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7d19b82c-b0d9-4a9d-b5e6-03f791ba8266]: (4, ('Thu Jan 22 12:19:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e)\n5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e\nThu Jan 22 12:19:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e)\n5184a8c8dce40e7a1de3dbe01f291520f2ad24407839d38396ec3932be28057e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.078 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[da7402a6-e251-49e1-ab0f-642f656a4ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.079 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:01 np0005591285 kernel: tapaabf11c6-e0: left promiscuous mode
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.081 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.092 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.095 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8c19f912-ce91-4b17-9edb-428a22003d17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.123 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[10419602-5624-4193-accf-c67ce72f65ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.125 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f177059d-fff6-478a-bf42-ccd485f5b340]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.140 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[389af497-931b-4f22-9c48-6e783c1cabd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567314, 'reachable_time': 22166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234605, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 systemd[1]: run-netns-ovnmeta\x2daabf11c6\x2def94\x2d408a\x2d8148\x2d6c6400566606.mount: Deactivated successfully.
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.143 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:19:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:01.143 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1f886f-a7a4-48ae-a92b-3f0c5b79e36a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.157 182759 INFO nova.compute.manager [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.157 182759 DEBUG oslo.service.loopingcall [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.158 182759 DEBUG nova.compute.manager [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.158 182759 DEBUG nova.network.neutron [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.177 182759 DEBUG nova.compute.manager [req-ad47cc5c-97f4-447c-b63b-24f926087ec3 req-9dbaf156-d10d-428b-90b7-3c794ffec5b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-vif-unplugged-e38b41ce-ced6-421a-ade5-becfd62fa83d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.177 182759 DEBUG oslo_concurrency.lockutils [req-ad47cc5c-97f4-447c-b63b-24f926087ec3 req-9dbaf156-d10d-428b-90b7-3c794ffec5b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.178 182759 DEBUG oslo_concurrency.lockutils [req-ad47cc5c-97f4-447c-b63b-24f926087ec3 req-9dbaf156-d10d-428b-90b7-3c794ffec5b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.178 182759 DEBUG oslo_concurrency.lockutils [req-ad47cc5c-97f4-447c-b63b-24f926087ec3 req-9dbaf156-d10d-428b-90b7-3c794ffec5b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.178 182759 DEBUG nova.compute.manager [req-ad47cc5c-97f4-447c-b63b-24f926087ec3 req-9dbaf156-d10d-428b-90b7-3c794ffec5b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] No waiting events found dispatching network-vif-unplugged-e38b41ce-ced6-421a-ade5-becfd62fa83d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:19:01 np0005591285 nova_compute[182755]: 2026-01-22 00:19:01.178 182759 DEBUG nova.compute.manager [req-ad47cc5c-97f4-447c-b63b-24f926087ec3 req-9dbaf156-d10d-428b-90b7-3c794ffec5b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-vif-unplugged-e38b41ce-ced6-421a-ade5-becfd62fa83d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:19:02 np0005591285 podman[234607]: 2026-01-22 00:19:02.191640012 +0000 UTC m=+0.066203186 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:19:02 np0005591285 podman[234606]: 2026-01-22 00:19:02.19383992 +0000 UTC m=+0.069512014 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.661 182759 DEBUG nova.network.neutron [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.682 182759 INFO nova.compute.manager [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Took 1.52 seconds to deallocate network for instance.#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.731 182759 DEBUG nova.compute.manager [req-abf6593f-61ba-41e1-8521-e5ad103e7478 req-00f9561a-ec26-46be-af89-56bf6f4b5ff3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-vif-deleted-e38b41ce-ced6-421a-ade5-becfd62fa83d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.771 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.771 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.785 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.837 182759 DEBUG nova.compute.provider_tree [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:19:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:02.983 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:02.984 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:02.984 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:02 np0005591285 nova_compute[182755]: 2026-01-22 00:19:02.985 182759 DEBUG nova.scheduler.client.report [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.176 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.261 182759 INFO nova.scheduler.client.report [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance fb2dc221-bb45-4407-90b5-ce2fe888001c#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.348 182759 DEBUG nova.compute.manager [req-1e116260-f653-4b12-b244-303efad46bc9 req-28912778-0a8a-4190-8107-0ba369b0c802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.349 182759 DEBUG oslo_concurrency.lockutils [req-1e116260-f653-4b12-b244-303efad46bc9 req-28912778-0a8a-4190-8107-0ba369b0c802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.350 182759 DEBUG oslo_concurrency.lockutils [req-1e116260-f653-4b12-b244-303efad46bc9 req-28912778-0a8a-4190-8107-0ba369b0c802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.350 182759 DEBUG oslo_concurrency.lockutils [req-1e116260-f653-4b12-b244-303efad46bc9 req-28912778-0a8a-4190-8107-0ba369b0c802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.351 182759 DEBUG nova.compute.manager [req-1e116260-f653-4b12-b244-303efad46bc9 req-28912778-0a8a-4190-8107-0ba369b0c802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] No waiting events found dispatching network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:19:03 np0005591285 nova_compute[182755]: 2026-01-22 00:19:03.351 182759 WARNING nova.compute.manager [req-1e116260-f653-4b12-b244-303efad46bc9 req-28912778-0a8a-4190-8107-0ba369b0c802 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Received unexpected event network-vif-plugged-e38b41ce-ced6-421a-ade5-becfd62fa83d for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:19:04 np0005591285 nova_compute[182755]: 2026-01-22 00:19:04.907 182759 DEBUG oslo_concurrency.lockutils [None req-d60bc20f-01db-4a60-82f0-dd0373e292ac 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "fb2dc221-bb45-4407-90b5-ce2fe888001c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:06 np0005591285 nova_compute[182755]: 2026-01-22 00:19:06.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:07 np0005591285 nova_compute[182755]: 2026-01-22 00:19:07.787 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:08 np0005591285 nova_compute[182755]: 2026-01-22 00:19:08.852 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:08 np0005591285 nova_compute[182755]: 2026-01-22 00:19:08.853 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:08 np0005591285 nova_compute[182755]: 2026-01-22 00:19:08.882 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.196 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.197 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.204 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.205 182759 INFO nova.compute.claims [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.387 182759 DEBUG nova.compute.provider_tree [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.406 182759 DEBUG nova.scheduler.client.report [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.446 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.447 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.523 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.524 182759 DEBUG nova.network.neutron [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.576 182759 INFO nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.598 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.954 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.956 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.956 182759 INFO nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Creating image(s)#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.957 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.957 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.957 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:09 np0005591285 nova_compute[182755]: 2026-01-22 00:19:09.971 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.029 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.031 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.031 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.043 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.099 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.100 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.124 182759 DEBUG nova.policy [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.138 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.139 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.139 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.192 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.194 182759 DEBUG nova.virt.disk.api [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.194 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.250 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.251 182759 DEBUG nova.virt.disk.api [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.252 182759 DEBUG nova.objects.instance [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.274 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.275 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Ensure instance console log exists: /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.275 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.276 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.276 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:10 np0005591285 nova_compute[182755]: 2026-01-22 00:19:10.916 182759 DEBUG nova.network.neutron [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Successfully created port: 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:19:11 np0005591285 nova_compute[182755]: 2026-01-22 00:19:11.066 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.171 182759 DEBUG nova.network.neutron [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Successfully updated port: 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:19:12 np0005591285 podman[234660]: 2026-01-22 00:19:12.196229487 +0000 UTC m=+0.061604804 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.198 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-3a2a5065-a19a-41e9-ab2f-11e0d05fee11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.198 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-3a2a5065-a19a-41e9-ab2f-11e0d05fee11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.198 182759 DEBUG nova.network.neutron [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.264 182759 DEBUG nova.compute.manager [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-changed-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.264 182759 DEBUG nova.compute.manager [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Refreshing instance network info cache due to event network-changed-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.265 182759 DEBUG oslo_concurrency.lockutils [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3a2a5065-a19a-41e9-ab2f-11e0d05fee11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.415 182759 DEBUG nova.network.neutron [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:19:12 np0005591285 nova_compute[182755]: 2026-01-22 00:19:12.789 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:13 np0005591285 nova_compute[182755]: 2026-01-22 00:19:13.045 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041138.0439112, c4b8d52a-d6d6-4588-91ae-5eedc3a8db48 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:19:13 np0005591285 nova_compute[182755]: 2026-01-22 00:19:13.046 182759 INFO nova.compute.manager [-] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:19:13 np0005591285 nova_compute[182755]: 2026-01-22 00:19:13.066 182759 DEBUG nova.compute.manager [None req-53de3daa-7ff3-49c4-b02a-5106c58de266 - - - - - -] [instance: c4b8d52a-d6d6-4588-91ae-5eedc3a8db48] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.454 182759 DEBUG nova.network.neutron [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Updating instance_info_cache with network_info: [{"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.482 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-3a2a5065-a19a-41e9-ab2f-11e0d05fee11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.483 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Instance network_info: |[{"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.484 182759 DEBUG oslo_concurrency.lockutils [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3a2a5065-a19a-41e9-ab2f-11e0d05fee11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.485 182759 DEBUG nova.network.neutron [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Refreshing network info cache for port 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.490 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Start _get_guest_xml network_info=[{"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.498 182759 WARNING nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.505 182759 DEBUG nova.virt.libvirt.host [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.505 182759 DEBUG nova.virt.libvirt.host [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.511 182759 DEBUG nova.virt.libvirt.host [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.511 182759 DEBUG nova.virt.libvirt.host [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.512 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.512 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.513 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.513 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.513 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.513 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.513 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.513 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.514 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.514 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.514 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.514 182759 DEBUG nova.virt.hardware [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.517 182759 DEBUG nova.virt.libvirt.vif [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1928398450',display_name='tempest-ServersTestJSON-server-1928398450',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1928398450',id=146,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-j6btjais',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:09Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=3a2a5065-a19a-41e9-ab2f-11e0d05fee11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.517 182759 DEBUG nova.network.os_vif_util [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.518 182759 DEBUG nova.network.os_vif_util [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.519 182759 DEBUG nova.objects.instance [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.535 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <uuid>3a2a5065-a19a-41e9-ab2f-11e0d05fee11</uuid>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <name>instance-00000092</name>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:name>tempest-ServersTestJSON-server-1928398450</nova:name>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:19:14</nova:creationTime>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        <nova:port uuid="4fe6ce9a-3a01-4bfa-98b1-3844ff1be391">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <entry name="serial">3a2a5065-a19a-41e9-ab2f-11e0d05fee11</entry>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <entry name="uuid">3a2a5065-a19a-41e9-ab2f-11e0d05fee11</entry>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.config"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:9a:3e:e5"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <target dev="tap4fe6ce9a-3a"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/console.log" append="off"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:19:14 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:19:14 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:19:14 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:19:14 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.536 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Preparing to wait for external event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.537 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.537 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.537 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.538 182759 DEBUG nova.virt.libvirt.vif [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1928398450',display_name='tempest-ServersTestJSON-server-1928398450',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1928398450',id=146,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-j6btjais',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:09Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=3a2a5065-a19a-41e9-ab2f-11e0d05fee11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.538 182759 DEBUG nova.network.os_vif_util [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.538 182759 DEBUG nova.network.os_vif_util [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.539 182759 DEBUG os_vif [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.539 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.539 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.540 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.544 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4fe6ce9a-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.544 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4fe6ce9a-3a, col_values=(('external_ids', {'iface-id': '4fe6ce9a-3a01-4bfa-98b1-3844ff1be391', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:3e:e5', 'vm-uuid': '3a2a5065-a19a-41e9-ab2f-11e0d05fee11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:14 np0005591285 NetworkManager[55017]: <info>  [1769041154.5477] manager: (tap4fe6ce9a-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.548 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.553 182759 INFO os_vif [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a')#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.624 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.624 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.625 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:9a:3e:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:19:14 np0005591285 nova_compute[182755]: 2026-01-22 00:19:14.625 182759 INFO nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Using config drive#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.454 182759 INFO nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Creating config drive at /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.config#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.459 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkyrn3wzu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.584 182759 DEBUG oslo_concurrency.processutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkyrn3wzu" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:15 np0005591285 kernel: tap4fe6ce9a-3a: entered promiscuous mode
Jan 21 19:19:15 np0005591285 NetworkManager[55017]: <info>  [1769041155.6475] manager: (tap4fe6ce9a-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Jan 21 19:19:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:15Z|00566|binding|INFO|Claiming lport 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 for this chassis.
Jan 21 19:19:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:15Z|00567|binding|INFO|4fe6ce9a-3a01-4bfa-98b1-3844ff1be391: Claiming fa:16:3e:9a:3e:e5 10.100.0.3
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.650 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.662 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.659 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:3e:e5 10.100.0.3'], port_security=['fa:16:3e:9a:3e:e5 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3a2a5065-a19a-41e9-ab2f-11e0d05fee11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.660 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 in datapath aabf11c6-ef94-408a-8148-6c6400566606 bound to our chassis#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.662 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606#033[00m
Jan 21 19:19:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:15Z|00568|binding|INFO|Setting lport 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 ovn-installed in OVS
Jan 21 19:19:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:15Z|00569|binding|INFO|Setting lport 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 up in Southbound
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.666 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.674 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[62ed8447-a12e-49d0-9ce8-3efdc790926f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.675 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabf11c6-e1 in ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.676 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabf11c6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.676 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[57acef72-cdea-40de-bbc4-806f9d1a6af0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.677 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c4fdb617-c459-4219-9a7d-a79a2b467eef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 systemd-machined[154022]: New machine qemu-67-instance-00000092.
Jan 21 19:19:15 np0005591285 podman[234691]: 2026-01-22 00:19:15.686589264 +0000 UTC m=+0.066080833 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:19:15 np0005591285 podman[234696]: 2026-01-22 00:19:15.686627075 +0000 UTC m=+0.066263188 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.687 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[338309eb-389d-4cc6-bb88-9010cf750704]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 systemd[1]: Started Virtual Machine qemu-67-instance-00000092.
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.715 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4b94c3-14de-4c46-a5fb-9e2422f2586d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 systemd-udevd[234747]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:19:15 np0005591285 NetworkManager[55017]: <info>  [1769041155.7291] device (tap4fe6ce9a-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:19:15 np0005591285 NetworkManager[55017]: <info>  [1769041155.7295] device (tap4fe6ce9a-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.742 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c08e08f2-f004-47e5-b904-9fc973f296c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 NetworkManager[55017]: <info>  [1769041155.7490] manager: (tapaabf11c6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/274)
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.748 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad6442c-e1a0-440c-bf71-98140c0639c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.788 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbbbe72-c5d5-4b53-8eb5-218fadef0b51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.791 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb24e80-09a8-4a93-bc90-a2438d8e243f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 NetworkManager[55017]: <info>  [1769041155.8160] device (tapaabf11c6-e0): carrier: link connected
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.821 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d0d533-fb92-43dd-9c82-f321610946d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.836 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e80597-4108-4468-b3e6-866e681a8cf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571447, 'reachable_time': 21926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234777, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.850 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e2f2a8-1059-4aee-83a2-0ce540a24c49]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:1b62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571447, 'tstamp': 571447}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234778, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.865 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c65836f9-2216-4263-baff-aaf266dc4779]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571447, 'reachable_time': 21926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234779, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.901 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[63871a19-5ed0-45e1-a8cf-90416ecf95d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.967 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7bb9b5-6f66-4eab-9d03-49f0a4c1846b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.974 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.974 182759 DEBUG nova.compute.manager [req-e7572f05-2c24-4a0f-9fa6-a4a2264eac52 req-0ad15671-4467-4694-bdf4-3549e375c5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.974 182759 DEBUG oslo_concurrency.lockutils [req-e7572f05-2c24-4a0f-9fa6-a4a2264eac52 req-0ad15671-4467-4694-bdf4-3549e375c5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.975 182759 DEBUG oslo_concurrency.lockutils [req-e7572f05-2c24-4a0f-9fa6-a4a2264eac52 req-0ad15671-4467-4694-bdf4-3549e375c5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.975 182759 DEBUG oslo_concurrency.lockutils [req-e7572f05-2c24-4a0f-9fa6-a4a2264eac52 req-0ad15671-4467-4694-bdf4-3549e375c5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.975 182759 DEBUG nova.compute.manager [req-e7572f05-2c24-4a0f-9fa6-a4a2264eac52 req-0ad15671-4467-4694-bdf4-3549e375c5a4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Processing event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.975 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.976 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 NetworkManager[55017]: <info>  [1769041155.9801] manager: (tapaabf11c6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Jan 21 19:19:15 np0005591285 kernel: tapaabf11c6-e0: entered promiscuous mode
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.981 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.987 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.989 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:15Z|00570|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 21 19:19:15 np0005591285 nova_compute[182755]: 2026-01-22 00:19:15.990 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.995 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.997 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[881cee7f-ebbd-4e02-af67-9566b2cda08f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:15.998 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-aabf11c6-ef94-408a-8148-6c6400566606
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:19:15 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID aabf11c6-ef94-408a-8148-6c6400566606
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 19:19:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:16.002 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'env', 'PROCESS_TAG=haproxy-aabf11c6-ef94-408a-8148-6c6400566606', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabf11c6-ef94-408a-8148-6c6400566606.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.041 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041141.0401378, fb2dc221-bb45-4407-90b5-ce2fe888001c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.041 182759 INFO nova.compute.manager [-] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] VM Stopped (Lifecycle Event)
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.065 182759 DEBUG nova.compute.manager [None req-20c4f4a4-643a-4a8d-8d3c-c7d7bce27a13 - - - - - -] [instance: fb2dc221-bb45-4407-90b5-ce2fe888001c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.422 182759 DEBUG nova.network.neutron [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Updated VIF entry in instance network info cache for port 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.422 182759 DEBUG nova.network.neutron [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Updating instance_info_cache with network_info: [{"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.446 182759 DEBUG oslo_concurrency.lockutils [req-dce91e3a-dc77-4d8e-bf0d-d3aee0870dbb req-8d80cf9f-31e6-440e-8972-23fc98dfc824 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3a2a5065-a19a-41e9-ab2f-11e0d05fee11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:19:16 np0005591285 podman[234809]: 2026-01-22 00:19:16.475158747 +0000 UTC m=+0.070605253 container create 51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:19:16 np0005591285 systemd[1]: Started libpod-conmon-51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24.scope.
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.522 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041156.521566, 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.522 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] VM Started (Lifecycle Event)
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.525 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 19:19:16 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.529 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 19:19:16 np0005591285 podman[234809]: 2026-01-22 00:19:16.441666404 +0000 UTC m=+0.037112990 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.533 182759 INFO nova.virt.libvirt.driver [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Instance spawned successfully.
Jan 21 19:19:16 np0005591285 nova_compute[182755]: 2026-01-22 00:19:16.533 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 19:19:16 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2960a8855285e92d0f70784be06e65ed72c262337c559c0be36c5a50d4573f61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:19:16 np0005591285 podman[234809]: 2026-01-22 00:19:16.547837495 +0000 UTC m=+0.143284021 container init 51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:19:16 np0005591285 podman[234809]: 2026-01-22 00:19:16.556352512 +0000 UTC m=+0.151799008 container start 51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:19:16 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [NOTICE]   (234836) : New worker (234838) forked
Jan 21 19:19:16 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [NOTICE]   (234836) : Loading success.
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.150 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.156 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.328 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.329 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.329 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.330 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.331 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.332 182759 DEBUG nova.virt.libvirt.driver [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.581 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.582 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041156.5217292, 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.583 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] VM Paused (Lifecycle Event)
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.793 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.843 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.848 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041156.5283637, 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:17 np0005591285 nova_compute[182755]: 2026-01-22 00:19:17.848 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] VM Resumed (Lifecycle Event)
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.085 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.089 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.189 182759 DEBUG nova.compute.manager [req-203157d2-9742-4c9b-add9-1e007e95e9ff req-8df5e420-63a1-4dbb-b32c-a41e8d9481d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.189 182759 DEBUG oslo_concurrency.lockutils [req-203157d2-9742-4c9b-add9-1e007e95e9ff req-8df5e420-63a1-4dbb-b32c-a41e8d9481d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.190 182759 DEBUG oslo_concurrency.lockutils [req-203157d2-9742-4c9b-add9-1e007e95e9ff req-8df5e420-63a1-4dbb-b32c-a41e8d9481d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.190 182759 DEBUG oslo_concurrency.lockutils [req-203157d2-9742-4c9b-add9-1e007e95e9ff req-8df5e420-63a1-4dbb-b32c-a41e8d9481d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.190 182759 DEBUG nova.compute.manager [req-203157d2-9742-4c9b-add9-1e007e95e9ff req-8df5e420-63a1-4dbb-b32c-a41e8d9481d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] No waiting events found dispatching network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.190 182759 WARNING nova.compute.manager [req-203157d2-9742-4c9b-add9-1e007e95e9ff req-8df5e420-63a1-4dbb-b32c-a41e8d9481d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received unexpected event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 for instance with vm_state building and task_state spawning.
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.218 182759 INFO nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Took 8.26 seconds to spawn the instance on the hypervisor.
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.218 182759 DEBUG nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.473 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.615 182759 INFO nova.compute.manager [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Took 9.48 seconds to build instance.
Jan 21 19:19:18 np0005591285 nova_compute[182755]: 2026-01-22 00:19:18.640 182759 DEBUG oslo_concurrency.lockutils [None req-e5452d3d-a693-46ac-8a85-6181fa6d5bc7 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:19:19 np0005591285 podman[234847]: 2026-01-22 00:19:19.241369648 +0000 UTC m=+0.106861740 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:19:19 np0005591285 nova_compute[182755]: 2026-01-22 00:19:19.547 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:22 np0005591285 nova_compute[182755]: 2026-01-22 00:19:22.795 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.248 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.249 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.249 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.249 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.249 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.262 182759 INFO nova.compute.manager [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Terminating instance
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.275 182759 DEBUG nova.compute.manager [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 19:19:24 np0005591285 kernel: tap4fe6ce9a-3a (unregistering): left promiscuous mode
Jan 21 19:19:24 np0005591285 NetworkManager[55017]: <info>  [1769041164.2929] device (tap4fe6ce9a-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:19:24 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:24Z|00571|binding|INFO|Releasing lport 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 from this chassis (sb_readonly=0)
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.301 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:24 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:24Z|00572|binding|INFO|Setting lport 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 down in Southbound
Jan 21 19:19:24 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:24Z|00573|binding|INFO|Removing iface tap4fe6ce9a-3a ovn-installed in OVS
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.313 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:3e:e5 10.100.0.3'], port_security=['fa:16:3e:9a:3e:e5 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3a2a5065-a19a-41e9-ab2f-11e0d05fee11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.315 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.319 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabf11c6-ef94-408a-8148-6c6400566606, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.321 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d9593812-61cf-422b-a707-570188c92371]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.322 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace which is not needed anymore
Jan 21 19:19:24 np0005591285 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000092.scope: Deactivated successfully.
Jan 21 19:19:24 np0005591285 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000092.scope: Consumed 8.581s CPU time.
Jan 21 19:19:24 np0005591285 systemd-machined[154022]: Machine qemu-67-instance-00000092 terminated.
Jan 21 19:19:24 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [NOTICE]   (234836) : haproxy version is 2.8.14-c23fe91
Jan 21 19:19:24 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [NOTICE]   (234836) : path to executable is /usr/sbin/haproxy
Jan 21 19:19:24 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [WARNING]  (234836) : Exiting Master process...
Jan 21 19:19:24 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [ALERT]    (234836) : Current worker (234838) exited with code 143 (Terminated)
Jan 21 19:19:24 np0005591285 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234832]: [WARNING]  (234836) : All workers exited. Exiting... (0)
Jan 21 19:19:24 np0005591285 systemd[1]: libpod-51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24.scope: Deactivated successfully.
Jan 21 19:19:24 np0005591285 podman[234897]: 2026-01-22 00:19:24.479823081 +0000 UTC m=+0.053114087 container died 51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:19:24 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24-userdata-shm.mount: Deactivated successfully.
Jan 21 19:19:24 np0005591285 systemd[1]: var-lib-containers-storage-overlay-2960a8855285e92d0f70784be06e65ed72c262337c559c0be36c5a50d4573f61-merged.mount: Deactivated successfully.
Jan 21 19:19:24 np0005591285 podman[234897]: 2026-01-22 00:19:24.528083369 +0000 UTC m=+0.101374365 container cleanup 51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.548 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:24 np0005591285 systemd[1]: libpod-conmon-51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24.scope: Deactivated successfully.
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.561 182759 INFO nova.virt.libvirt.driver [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Instance destroyed successfully.#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.562 182759 DEBUG nova.objects.instance [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:19:24 np0005591285 podman[234939]: 2026-01-22 00:19:24.610165497 +0000 UTC m=+0.048717130 container remove 51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.611 182759 DEBUG nova.virt.libvirt.vif [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1928398450',display_name='tempest-ServersTestJSON-server-1928398450',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1928398450',id=146,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-j6btjais',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:19:21Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=3a2a5065-a19a-41e9-ab2f-11e0d05fee11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.611 182759 DEBUG nova.network.os_vif_util [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "address": "fa:16:3e:9a:3e:e5", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4fe6ce9a-3a", "ovs_interfaceid": "4fe6ce9a-3a01-4bfa-98b1-3844ff1be391", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.612 182759 DEBUG nova.network.os_vif_util [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.613 182759 DEBUG os_vif [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.615 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4fe6ce9a-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.615 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a26693b5-f59d-4398-8887-8b8dba36fb2b]: (4, ('Thu Jan 22 12:19:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24)\n51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24\nThu Jan 22 12:19:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24)\n51fb40c1dba42df5b674f444315adfff8d79df0acd6485bf8be6c4f842077a24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.617 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0f89c2f7-a03b-44a6-b532-d43d7b052120]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.618 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:24 np0005591285 kernel: tapaabf11c6-e0: left promiscuous mode
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.621 182759 INFO os_vif [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:3e:e5,bridge_name='br-int',has_traffic_filtering=True,id=4fe6ce9a-3a01-4bfa-98b1-3844ff1be391,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4fe6ce9a-3a')#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.622 182759 INFO nova.virt.libvirt.driver [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Deleting instance files /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11_del#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.623 182759 INFO nova.virt.libvirt.driver [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Deletion of /var/lib/nova/instances/3a2a5065-a19a-41e9-ab2f-11e0d05fee11_del complete#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.638 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7c7cb8-ddda-4800-ba94-865bb455f77c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.657 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed306ad-44f0-4fd3-a446-940705b8247e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.658 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e207722e-6e19-426f-ac4b-04cdf3670241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.673 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba70b3d3-bfb1-4519-9319-371c0cfda30a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571439, 'reachable_time': 22996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234959, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.676 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:19:24 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:24.676 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[690394fc-ffb5-466a-b440-1a4ed9eba340]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:24 np0005591285 systemd[1]: run-netns-ovnmeta\x2daabf11c6\x2def94\x2d408a\x2d8148\x2d6c6400566606.mount: Deactivated successfully.
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.731 182759 INFO nova.compute.manager [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.732 182759 DEBUG oslo.service.loopingcall [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.733 182759 DEBUG nova.compute.manager [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.733 182759 DEBUG nova.network.neutron [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.770 182759 DEBUG nova.compute.manager [req-f9e386d4-8815-4472-935e-a72943fb925c req-86ea91c6-42ff-4457-9ec1-f94fbfcbf38b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-vif-unplugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.771 182759 DEBUG oslo_concurrency.lockutils [req-f9e386d4-8815-4472-935e-a72943fb925c req-86ea91c6-42ff-4457-9ec1-f94fbfcbf38b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.771 182759 DEBUG oslo_concurrency.lockutils [req-f9e386d4-8815-4472-935e-a72943fb925c req-86ea91c6-42ff-4457-9ec1-f94fbfcbf38b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.771 182759 DEBUG oslo_concurrency.lockutils [req-f9e386d4-8815-4472-935e-a72943fb925c req-86ea91c6-42ff-4457-9ec1-f94fbfcbf38b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.772 182759 DEBUG nova.compute.manager [req-f9e386d4-8815-4472-935e-a72943fb925c req-86ea91c6-42ff-4457-9ec1-f94fbfcbf38b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] No waiting events found dispatching network-vif-unplugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:19:24 np0005591285 nova_compute[182755]: 2026-01-22 00:19:24.772 182759 DEBUG nova.compute.manager [req-f9e386d4-8815-4472-935e-a72943fb925c req-86ea91c6-42ff-4457-9ec1-f94fbfcbf38b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-vif-unplugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.116 182759 DEBUG nova.network.neutron [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.144 182759 INFO nova.compute.manager [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Took 1.41 seconds to deallocate network for instance.#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.278 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.279 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.360 182759 DEBUG nova.compute.provider_tree [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.380 182759 DEBUG nova.scheduler.client.report [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.404 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.437 182759 INFO nova.scheduler.client.report [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance 3a2a5065-a19a-41e9-ab2f-11e0d05fee11#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.559 182759 DEBUG oslo_concurrency.lockutils [None req-3615ac76-d1fd-4d8e-8b6a-190383c652e4 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.900 182759 DEBUG nova.compute.manager [req-576aa45f-f037-448e-ba21-d4803d63acd8 req-3136776f-1feb-4d17-8490-e30257a3f1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.901 182759 DEBUG oslo_concurrency.lockutils [req-576aa45f-f037-448e-ba21-d4803d63acd8 req-3136776f-1feb-4d17-8490-e30257a3f1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.902 182759 DEBUG oslo_concurrency.lockutils [req-576aa45f-f037-448e-ba21-d4803d63acd8 req-3136776f-1feb-4d17-8490-e30257a3f1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.902 182759 DEBUG oslo_concurrency.lockutils [req-576aa45f-f037-448e-ba21-d4803d63acd8 req-3136776f-1feb-4d17-8490-e30257a3f1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3a2a5065-a19a-41e9-ab2f-11e0d05fee11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.903 182759 DEBUG nova.compute.manager [req-576aa45f-f037-448e-ba21-d4803d63acd8 req-3136776f-1feb-4d17-8490-e30257a3f1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] No waiting events found dispatching network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.903 182759 WARNING nova.compute.manager [req-576aa45f-f037-448e-ba21-d4803d63acd8 req-3136776f-1feb-4d17-8490-e30257a3f1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received unexpected event network-vif-plugged-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:19:26 np0005591285 nova_compute[182755]: 2026-01-22 00:19:26.952 182759 DEBUG nova.compute.manager [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Received event network-vif-deleted-4fe6ce9a-3a01-4bfa-98b1-3844ff1be391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:27 np0005591285 nova_compute[182755]: 2026-01-22 00:19:27.797 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:29 np0005591285 nova_compute[182755]: 2026-01-22 00:19:29.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:32 np0005591285 nova_compute[182755]: 2026-01-22 00:19:32.801 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:33 np0005591285 podman[234960]: 2026-01-22 00:19:33.21271249 +0000 UTC m=+0.078778541 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Jan 21 19:19:33 np0005591285 podman[234961]: 2026-01-22 00:19:33.213802869 +0000 UTC m=+0.078885494 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:19:34 np0005591285 nova_compute[182755]: 2026-01-22 00:19:34.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:37 np0005591285 nova_compute[182755]: 2026-01-22 00:19:37.802 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:39 np0005591285 nova_compute[182755]: 2026-01-22 00:19:39.560 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041164.5587163, 3a2a5065-a19a-41e9-ab2f-11e0d05fee11 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:19:39 np0005591285 nova_compute[182755]: 2026-01-22 00:19:39.560 182759 INFO nova.compute.manager [-] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:19:39 np0005591285 nova_compute[182755]: 2026-01-22 00:19:39.588 182759 DEBUG nova.compute.manager [None req-dd7ac628-90e2-4bc3-94a6-4e97e9311538 - - - - - -] [instance: 3a2a5065-a19a-41e9-ab2f-11e0d05fee11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:19:39 np0005591285 nova_compute[182755]: 2026-01-22 00:19:39.624 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.734 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.734 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.758 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:19:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:42.797 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.798 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:42.798 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.804 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.903 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.904 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.909 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:19:42 np0005591285 nova_compute[182755]: 2026-01-22 00:19:42.910 182759 INFO nova.compute.claims [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.035 182759 DEBUG nova.compute.provider_tree [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.064 182759 DEBUG nova.scheduler.client.report [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.100 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.100 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:19:43 np0005591285 podman[234998]: 2026-01-22 00:19:43.171924404 +0000 UTC m=+0.048931127 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.173 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.174 182759 DEBUG nova.network.neutron [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.195 182759 INFO nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.221 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.401 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.402 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.402 182759 INFO nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Creating image(s)#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.402 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.403 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.403 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.415 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.487 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.488 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.488 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.499 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.532 182759 DEBUG nova.policy [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.561 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.561 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.592 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.594 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.594 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.650 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.651 182759 DEBUG nova.virt.disk.api [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.651 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.707 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.708 182759 DEBUG nova.virt.disk.api [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.708 182759 DEBUG nova.objects.instance [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 6b668707-d685-4bda-bfbf-c52a9214fc5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.730 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.731 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Ensure instance console log exists: /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.731 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.732 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:43 np0005591285 nova_compute[182755]: 2026-01-22 00:19:43.732 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:43.801 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:44 np0005591285 nova_compute[182755]: 2026-01-22 00:19:44.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:45 np0005591285 nova_compute[182755]: 2026-01-22 00:19:45.425 182759 DEBUG nova.network.neutron [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Successfully created port: 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:19:46 np0005591285 podman[235036]: 2026-01-22 00:19:46.206307294 +0000 UTC m=+0.081106553 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:19:46 np0005591285 podman[235037]: 2026-01-22 00:19:46.206855899 +0000 UTC m=+0.071890447 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:19:46 np0005591285 nova_compute[182755]: 2026-01-22 00:19:46.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.432 182759 DEBUG nova.network.neutron [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Successfully updated port: 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.547 182759 DEBUG nova.compute.manager [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.548 182759 DEBUG nova.compute.manager [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing instance network info cache due to event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.548 182759 DEBUG oslo_concurrency.lockutils [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.549 182759 DEBUG oslo_concurrency.lockutils [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.549 182759 DEBUG nova.network.neutron [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.562 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:47 np0005591285 nova_compute[182755]: 2026-01-22 00:19:47.838 182759 DEBUG nova.network.neutron [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.223 182759 DEBUG nova.network.neutron [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.239 182759 DEBUG oslo_concurrency.lockutils [req-5288befa-c153-49ea-8de1-19a01c244906 req-d071d083-1fb5-4d07-94d1-b654d6bcd08b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.240 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.240 182759 DEBUG nova.network.neutron [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:19:48 np0005591285 nova_compute[182755]: 2026-01-22 00:19:48.433 182759 DEBUG nova.network.neutron [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.497 182759 DEBUG nova.network.neutron [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.521 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.522 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Instance network_info: |[{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.525 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Start _get_guest_xml network_info=[{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.530 182759 WARNING nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.537 182759 DEBUG nova.virt.libvirt.host [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.538 182759 DEBUG nova.virt.libvirt.host [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.542 182759 DEBUG nova.virt.libvirt.host [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.543 182759 DEBUG nova.virt.libvirt.host [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.544 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.545 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.545 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.545 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.545 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.546 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.546 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.546 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.546 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.547 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.547 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.547 182759 DEBUG nova.virt.hardware [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.550 182759 DEBUG nova.virt.libvirt.vif [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:19:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-398620410',display_name='tempest-TestNetworkBasicOps-server-398620410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-398620410',id=148,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKIEvieg9TrMIcpLAYt3gk5p/YmFG00eTJcN+irKlcwFP4JIb9ny8lLu9l+wAcWyWHvM0k7OczTH+oTvUvPkfix7KDnMApoByZrmipa2gWnVbEhuqkyx+mSn5bJs2Pn/xw==',key_name='tempest-TestNetworkBasicOps-596721142',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-g3rthjto',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:43Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6b668707-d685-4bda-bfbf-c52a9214fc5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.551 182759 DEBUG nova.network.os_vif_util [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.551 182759 DEBUG nova.network.os_vif_util [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.552 182759 DEBUG nova.objects.instance [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b668707-d685-4bda-bfbf-c52a9214fc5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.569 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <uuid>6b668707-d685-4bda-bfbf-c52a9214fc5a</uuid>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <name>instance-00000094</name>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-398620410</nova:name>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:19:49</nova:creationTime>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        <nova:port uuid="93aa6e48-d8e4-4f3c-b816-eedca06529c0">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <entry name="serial">6b668707-d685-4bda-bfbf-c52a9214fc5a</entry>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <entry name="uuid">6b668707-d685-4bda-bfbf-c52a9214fc5a</entry>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.config"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:d8:48:67"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <target dev="tap93aa6e48-d8"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/console.log" append="off"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:19:49 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:19:49 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:19:49 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:19:49 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.570 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Preparing to wait for external event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.571 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.571 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.571 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.572 182759 DEBUG nova.virt.libvirt.vif [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:19:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-398620410',display_name='tempest-TestNetworkBasicOps-server-398620410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-398620410',id=148,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKIEvieg9TrMIcpLAYt3gk5p/YmFG00eTJcN+irKlcwFP4JIb9ny8lLu9l+wAcWyWHvM0k7OczTH+oTvUvPkfix7KDnMApoByZrmipa2gWnVbEhuqkyx+mSn5bJs2Pn/xw==',key_name='tempest-TestNetworkBasicOps-596721142',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-g3rthjto',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:43Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6b668707-d685-4bda-bfbf-c52a9214fc5a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.572 182759 DEBUG nova.network.os_vif_util [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.573 182759 DEBUG nova.network.os_vif_util [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.573 182759 DEBUG os_vif [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.573 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.574 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.574 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.578 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.579 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93aa6e48-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.579 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap93aa6e48-d8, col_values=(('external_ids', {'iface-id': '93aa6e48-d8e4-4f3c-b816-eedca06529c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:48:67', 'vm-uuid': '6b668707-d685-4bda-bfbf-c52a9214fc5a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:19:49 np0005591285 NetworkManager[55017]: <info>  [1769041189.5834] manager: (tap93aa6e48-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.589 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.591 182759 INFO os_vif [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8')#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.655 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.655 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.655 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:d8:48:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:19:49 np0005591285 nova_compute[182755]: 2026-01-22 00:19:49.656 182759 INFO nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Using config drive#033[00m
Jan 21 19:19:49 np0005591285 podman[235083]: 2026-01-22 00:19:49.746786258 +0000 UTC m=+0.120370530 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.236 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.236 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.390 182759 INFO nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Creating config drive at /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.config#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.395 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp024su4en execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.523 182759 DEBUG oslo_concurrency.processutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp024su4en" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:50 np0005591285 kernel: tap93aa6e48-d8: entered promiscuous mode
Jan 21 19:19:50 np0005591285 NetworkManager[55017]: <info>  [1769041190.5826] manager: (tap93aa6e48-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Jan 21 19:19:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:50Z|00574|binding|INFO|Claiming lport 93aa6e48-d8e4-4f3c-b816-eedca06529c0 for this chassis.
Jan 21 19:19:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:50Z|00575|binding|INFO|93aa6e48-d8e4-4f3c-b816-eedca06529c0: Claiming fa:16:3e:d8:48:67 10.100.0.5
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.586 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.605 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:48:67 10.100.0.5'], port_security=['fa:16:3e:d8:48:67 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09aa8d20-eb46-4367-945b-494fddadbef9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c95cb105-44d9-4f90-ae0f-7a483ddfaf48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=def33975-878d-4f5c-8b1e-729e778f0cd5, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=93aa6e48-d8e4-4f3c-b816-eedca06529c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.606 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 in datapath 09aa8d20-eb46-4367-945b-494fddadbef9 bound to our chassis#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.608 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09aa8d20-eb46-4367-945b-494fddadbef9#033[00m
Jan 21 19:19:50 np0005591285 systemd-udevd[235124]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.621 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[336a9102-4696-438e-aa1d-5d0f7d4f2874]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.622 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09aa8d20-e1 in ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.623 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09aa8d20-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.624 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2ac149-0b92-47d0-915c-765c632169c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.624 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d40e8354-3412-40fc-8c7c-64b93afb4adc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:50 np0005591285 NetworkManager[55017]: <info>  [1769041190.6259] device (tap93aa6e48-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:19:50 np0005591285 NetworkManager[55017]: <info>  [1769041190.6264] device (tap93aa6e48-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.636 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[89cc1c2c-b163-48d7-808f-54d4a32573fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:19:50 np0005591285 systemd-machined[154022]: New machine qemu-68-instance-00000094.
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.643 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:50Z|00576|binding|INFO|Setting lport 93aa6e48-d8e4-4f3c-b816-eedca06529c0 ovn-installed in OVS
Jan 21 19:19:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:50Z|00577|binding|INFO|Setting lport 93aa6e48-d8e4-4f3c-b816-eedca06529c0 up in Southbound
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.649 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.652 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eebf4d26-8a3c-4426-95c3-a847ec7a0aae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 systemd[1]: Started Virtual Machine qemu-68-instance-00000094.
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.682 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[67d8fc36-aa52-458c-aa48-eea7bf8b3a25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 NetworkManager[55017]: <info>  [1769041190.6896] manager: (tap09aa8d20-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/278)
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.688 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eaaa4b9d-1342-4c2b-b1b4-b7547f7141df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 systemd-udevd[235129]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.721 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[09a2b34f-76d0-4a9b-9fea-88a7e859f6da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.724 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b40fa3d9-29ec-4ad4-800d-dacc07aee558]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 NetworkManager[55017]: <info>  [1769041190.7467] device (tap09aa8d20-e0): carrier: link connected
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.752 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[97776119-42dd-47ee-bbc6-9299a7242c4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.770 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4d356c50-e138-4d1c-a1ed-41ac7d650c69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09aa8d20-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:d3:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574940, 'reachable_time': 23910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235160, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.786 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1750b1e4-005c-404f-a9a0-c3917b07c1e1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:d310'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574940, 'tstamp': 574940}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235161, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.804 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[838f0fcf-3134-449d-92d1-07195f15a76a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09aa8d20-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:d3:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574940, 'reachable_time': 23910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235162, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.830 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6933b459-412d-442d-8384-a07d5168425d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.888 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5259b24c-d462-4d6c-b326-ab57074a1955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.889 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09aa8d20-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.890 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.890 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09aa8d20-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.892 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:50 np0005591285 kernel: tap09aa8d20-e0: entered promiscuous mode
Jan 21 19:19:50 np0005591285 NetworkManager[55017]: <info>  [1769041190.8929] manager: (tap09aa8d20-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.894 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.895 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09aa8d20-e0, col_values=(('external_ids', {'iface-id': '41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.896 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:50Z|00578|binding|INFO|Releasing lport 41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae from this chassis (sb_readonly=0)
Jan 21 19:19:50 np0005591285 nova_compute[182755]: 2026-01-22 00:19:50.907 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.908 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09aa8d20-eb46-4367-945b-494fddadbef9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09aa8d20-eb46-4367-945b-494fddadbef9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.909 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dca692bf-e99a-4be2-b7bd-67d24ba72751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.909 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-09aa8d20-eb46-4367-945b-494fddadbef9
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/09aa8d20-eb46-4367-945b-494fddadbef9.pid.haproxy
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 09aa8d20-eb46-4367-945b-494fddadbef9
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 19:19:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:19:50.910 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'env', 'PROCESS_TAG=haproxy-09aa8d20-eb46-4367-945b-494fddadbef9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09aa8d20-eb46-4367-945b-494fddadbef9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 19:19:51 np0005591285 podman[235194]: 2026-01-22 00:19:51.214200651 +0000 UTC m=+0.018986067 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.336 182759 DEBUG nova.compute.manager [req-b0ad82f5-e579-4a7e-a0b4-52d748666097 req-bb3e3cda-2e38-48a9-96d9-137145b45b51 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.336 182759 DEBUG oslo_concurrency.lockutils [req-b0ad82f5-e579-4a7e-a0b4-52d748666097 req-bb3e3cda-2e38-48a9-96d9-137145b45b51 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.337 182759 DEBUG oslo_concurrency.lockutils [req-b0ad82f5-e579-4a7e-a0b4-52d748666097 req-bb3e3cda-2e38-48a9-96d9-137145b45b51 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.337 182759 DEBUG oslo_concurrency.lockutils [req-b0ad82f5-e579-4a7e-a0b4-52d748666097 req-bb3e3cda-2e38-48a9-96d9-137145b45b51 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.337 182759 DEBUG nova.compute.manager [req-b0ad82f5-e579-4a7e-a0b4-52d748666097 req-bb3e3cda-2e38-48a9-96d9-137145b45b51 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Processing event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.743 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041191.742988, 6b668707-d685-4bda-bfbf-c52a9214fc5a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.744 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] VM Started (Lifecycle Event)
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.745 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.750 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.754 182759 INFO nova.virt.libvirt.driver [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Instance spawned successfully.
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.754 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.785 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.790 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.791 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.791 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.792 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.792 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.793 182759 DEBUG nova.virt.libvirt.driver [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.797 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.829 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.829 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041191.7431183, 6b668707-d685-4bda-bfbf-c52a9214fc5a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.829 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] VM Paused (Lifecycle Event)
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.855 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.858 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041191.7490854, 6b668707-d685-4bda-bfbf-c52a9214fc5a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.859 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] VM Resumed (Lifecycle Event)
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.893 182759 INFO nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Took 8.49 seconds to spawn the instance on the hypervisor.
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.894 182759 DEBUG nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.896 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.903 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:19:51 np0005591285 nova_compute[182755]: 2026-01-22 00:19:51.950 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:19:52 np0005591285 nova_compute[182755]: 2026-01-22 00:19:52.009 182759 INFO nova.compute.manager [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Took 9.17 seconds to build instance.
Jan 21 19:19:52 np0005591285 nova_compute[182755]: 2026-01-22 00:19:52.028 182759 DEBUG oslo_concurrency.lockutils [None req-ee3e43f8-9750-4464-a34f-22083fc87f6e 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:19:52 np0005591285 podman[235194]: 2026-01-22 00:19:52.77524043 +0000 UTC m=+1.580025816 container create 977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:19:52 np0005591285 nova_compute[182755]: 2026-01-22 00:19:52.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:53 np0005591285 systemd[1]: Started libpod-conmon-977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d.scope.
Jan 21 19:19:53 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:19:53 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbd2bde88504642330d514732a207817e9a2afbd5c232906e48ad540e6790c45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.475 182759 DEBUG nova.compute.manager [req-061d9699-ca18-443d-94c6-ce3737b852c9 req-f62c8c46-2ac5-4b13-a44a-3647344f5455 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.476 182759 DEBUG oslo_concurrency.lockutils [req-061d9699-ca18-443d-94c6-ce3737b852c9 req-f62c8c46-2ac5-4b13-a44a-3647344f5455 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.477 182759 DEBUG oslo_concurrency.lockutils [req-061d9699-ca18-443d-94c6-ce3737b852c9 req-f62c8c46-2ac5-4b13-a44a-3647344f5455 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.477 182759 DEBUG oslo_concurrency.lockutils [req-061d9699-ca18-443d-94c6-ce3737b852c9 req-f62c8c46-2ac5-4b13-a44a-3647344f5455 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.478 182759 DEBUG nova.compute.manager [req-061d9699-ca18-443d-94c6-ce3737b852c9 req-f62c8c46-2ac5-4b13-a44a-3647344f5455 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:19:53 np0005591285 nova_compute[182755]: 2026-01-22 00:19:53.478 182759 WARNING nova.compute.manager [req-061d9699-ca18-443d-94c6-ce3737b852c9 req-f62c8c46-2ac5-4b13-a44a-3647344f5455 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:19:54 np0005591285 podman[235194]: 2026-01-22 00:19:54.147403174 +0000 UTC m=+2.952188580 container init 977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 19:19:54 np0005591285 podman[235194]: 2026-01-22 00:19:54.154685637 +0000 UTC m=+2.959471023 container start 977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:19:54 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [NOTICE]   (235220) : New worker (235222) forked
Jan 21 19:19:54 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [NOTICE]   (235220) : Loading success.
Jan 21 19:19:54 np0005591285 nova_compute[182755]: 2026-01-22 00:19:54.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:54 np0005591285 nova_compute[182755]: 2026-01-22 00:19:54.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.249 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.334 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.391 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.392 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.452 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.596 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.597 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5559MB free_disk=73.19233322143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.598 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.598 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.679 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 6b668707-d685-4bda-bfbf-c52a9214fc5a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.679 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.680 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.704 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.767 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.768 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.785 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.807 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.856 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.899 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.919 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:19:55 np0005591285 nova_compute[182755]: 2026-01-22 00:19:55.919 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:57 np0005591285 NetworkManager[55017]: <info>  [1769041197.3096] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 21 19:19:57 np0005591285 NetworkManager[55017]: <info>  [1769041197.3120] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:19:57Z|00579|binding|INFO|Releasing lport 41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae from this chassis (sb_readonly=0)
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.402 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.631 182759 DEBUG nova.compute.manager [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.632 182759 DEBUG nova.compute.manager [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing instance network info cache due to event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.633 182759 DEBUG oslo_concurrency.lockutils [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.633 182759 DEBUG oslo_concurrency.lockutils [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.633 182759 DEBUG nova.network.neutron [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:57 np0005591285 nova_compute[182755]: 2026-01-22 00:19:57.914 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:58 np0005591285 nova_compute[182755]: 2026-01-22 00:19:58.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:19:59 np0005591285 nova_compute[182755]: 2026-01-22 00:19:59.587 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:19:59 np0005591285 nova_compute[182755]: 2026-01-22 00:19:59.708 182759 DEBUG nova.network.neutron [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updated VIF entry in instance network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:19:59 np0005591285 nova_compute[182755]: 2026-01-22 00:19:59.709 182759 DEBUG nova.network.neutron [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:19:59 np0005591285 nova_compute[182755]: 2026-01-22 00:19:59.741 182759 DEBUG oslo_concurrency.lockutils [req-09c6c66f-4c09-4f62-bc1c-7941fe087a78 req-b68e76e7-fa4c-448a-9724-60133db38cdc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:01 np0005591285 nova_compute[182755]: 2026-01-22 00:20:01.189 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:02 np0005591285 nova_compute[182755]: 2026-01-22 00:20:02.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:02.984 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:02.986 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:02.987 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:04 np0005591285 podman[235254]: 2026-01-22 00:20:04.181441634 +0000 UTC m=+0.052937993 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41)
Jan 21 19:20:04 np0005591285 podman[235255]: 2026-01-22 00:20:04.208440143 +0000 UTC m=+0.076063299 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 21 19:20:04 np0005591285 nova_compute[182755]: 2026-01-22 00:20:04.590 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:04 np0005591285 nova_compute[182755]: 2026-01-22 00:20:04.928 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:04 np0005591285 nova_compute[182755]: 2026-01-22 00:20:04.929 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:04 np0005591285 nova_compute[182755]: 2026-01-22 00:20:04.956 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.070 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.071 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.080 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.080 182759 INFO nova.compute.claims [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.563 182759 DEBUG nova.compute.provider_tree [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.581 182759 DEBUG nova.scheduler.client.report [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.605 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.605 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.706 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.707 182759 DEBUG nova.network.neutron [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.744 182759 INFO nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:20:05 np0005591285 nova_compute[182755]: 2026-01-22 00:20:05.766 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:20:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:05Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:48:67 10.100.0.5
Jan 21 19:20:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:05Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:48:67 10.100.0.5
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.268 182759 DEBUG nova.policy [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.276 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.277 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.278 182759 INFO nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Creating image(s)#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.279 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.279 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.280 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.307 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.370 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.372 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.373 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.400 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.456 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.457 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.843 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk 1073741824" returned: 0 in 0.387s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.845 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.845 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.922 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.924 182759 DEBUG nova.virt.disk.api [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.924 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.994 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.995 182759 DEBUG nova.virt.disk.api [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:20:06 np0005591285 nova_compute[182755]: 2026-01-22 00:20:06.995 182759 DEBUG nova.objects.instance [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 04c5bee9-8745-45b2-884d-2abfbbec5d0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.024 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.024 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Ensure instance console log exists: /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.025 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.025 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.025 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.451 182759 DEBUG nova.network.neutron [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Successfully created port: 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:20:07 np0005591285 nova_compute[182755]: 2026-01-22 00:20:07.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.557 182759 DEBUG nova.network.neutron [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Successfully updated port: 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.591 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.591 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.591 182759 DEBUG nova.network.neutron [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.593 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.766 182759 DEBUG nova.compute.manager [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-changed-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.767 182759 DEBUG nova.compute.manager [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Refreshing instance network info cache due to event network-changed-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:20:09 np0005591285 nova_compute[182755]: 2026-01-22 00:20:09.767 182759 DEBUG oslo_concurrency.lockutils [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:10 np0005591285 nova_compute[182755]: 2026-01-22 00:20:10.159 182759 DEBUG nova.network.neutron [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.582 182759 DEBUG nova.network.neutron [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updating instance_info_cache with network_info: [{"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.602 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.602 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Instance network_info: |[{"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.603 182759 DEBUG oslo_concurrency.lockutils [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.603 182759 DEBUG nova.network.neutron [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Refreshing network info cache for port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.605 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Start _get_guest_xml network_info=[{"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.610 182759 WARNING nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.615 182759 DEBUG nova.virt.libvirt.host [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.616 182759 DEBUG nova.virt.libvirt.host [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.620 182759 DEBUG nova.virt.libvirt.host [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.620 182759 DEBUG nova.virt.libvirt.host [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.622 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.622 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.622 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.623 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.623 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.623 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.624 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.624 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.625 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.625 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.625 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.626 182759 DEBUG nova.virt.hardware [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.631 182759 DEBUG nova.virt.libvirt.vif [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-205298966',display_name='tempest-TestNetworkBasicOps-server-205298966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-205298966',id=149,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3d2Do150sU4k4vLE98K5ZXopEIgo5vuyIpIC/J+P0c2gsHillvzH56tRAy7PK/iETBFSZEp/AHY9tJxPJOszxY/i7AfqrLWgGciqd22u5Iwpdczd5JmZABg5vIE2R2GA==',key_name='tempest-TestNetworkBasicOps-2098586360',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-h209qu80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:05Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=04c5bee9-8745-45b2-884d-2abfbbec5d0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.632 182759 DEBUG nova.network.os_vif_util [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.632 182759 DEBUG nova.network.os_vif_util [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.633 182759 DEBUG nova.objects.instance [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 04c5bee9-8745-45b2-884d-2abfbbec5d0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.651 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <uuid>04c5bee9-8745-45b2-884d-2abfbbec5d0e</uuid>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <name>instance-00000095</name>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-205298966</nova:name>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:20:11</nova:creationTime>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        <nova:port uuid="5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <entry name="serial">04c5bee9-8745-45b2-884d-2abfbbec5d0e</entry>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <entry name="uuid">04c5bee9-8745-45b2-884d-2abfbbec5d0e</entry>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.config"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:81:a9:f6"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <target dev="tap5f30e7af-d4"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/console.log" append="off"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:20:11 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:20:11 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:20:11 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:20:11 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.654 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Preparing to wait for external event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.654 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.654 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.655 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.655 182759 DEBUG nova.virt.libvirt.vif [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-205298966',display_name='tempest-TestNetworkBasicOps-server-205298966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-205298966',id=149,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3d2Do150sU4k4vLE98K5ZXopEIgo5vuyIpIC/J+P0c2gsHillvzH56tRAy7PK/iETBFSZEp/AHY9tJxPJOszxY/i7AfqrLWgGciqd22u5Iwpdczd5JmZABg5vIE2R2GA==',key_name='tempest-TestNetworkBasicOps-2098586360',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-h209qu80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:05Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=04c5bee9-8745-45b2-884d-2abfbbec5d0e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.656 182759 DEBUG nova.network.os_vif_util [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.656 182759 DEBUG nova.network.os_vif_util [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.657 182759 DEBUG os_vif [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.657 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.657 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.658 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.660 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.660 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f30e7af-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.660 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f30e7af-d4, col_values=(('external_ids', {'iface-id': '5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:a9:f6', 'vm-uuid': '04c5bee9-8745-45b2-884d-2abfbbec5d0e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.662 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:11 np0005591285 NetworkManager[55017]: <info>  [1769041211.6632] manager: (tap5f30e7af-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.664 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.669 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.670 182759 INFO os_vif [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4')#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.729 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.731 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.731 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:81:a9:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:20:11 np0005591285 nova_compute[182755]: 2026-01-22 00:20:11.731 182759 INFO nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Using config drive#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.306 182759 INFO nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Creating config drive at /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.config#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.312 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd6s3po2_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.443 182759 DEBUG oslo_concurrency.processutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd6s3po2_" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:20:12 np0005591285 kernel: tap5f30e7af-d4: entered promiscuous mode
Jan 21 19:20:12 np0005591285 NetworkManager[55017]: <info>  [1769041212.5047] manager: (tap5f30e7af-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Jan 21 19:20:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:12Z|00580|binding|INFO|Claiming lport 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df for this chassis.
Jan 21 19:20:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:12Z|00581|binding|INFO|5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df: Claiming fa:16:3e:81:a9:f6 10.100.0.4
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.505 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.517 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:a9:f6 10.100.0.4'], port_security=['fa:16:3e:81:a9:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09aa8d20-eb46-4367-945b-494fddadbef9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3745aef-6a18-4d3a-b122-7a49de407273', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=def33975-878d-4f5c-8b1e-729e778f0cd5, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.517 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:12Z|00582|binding|INFO|Setting lport 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df ovn-installed in OVS
Jan 21 19:20:12 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:12Z|00583|binding|INFO|Setting lport 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df up in Southbound
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.520 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df in datapath 09aa8d20-eb46-4367-945b-494fddadbef9 bound to our chassis#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.520 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.522 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09aa8d20-eb46-4367-945b-494fddadbef9#033[00m
Jan 21 19:20:12 np0005591285 systemd-udevd[235329]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.539 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ddffb791-de61-402d-b6eb-0e55427cdbc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:12 np0005591285 systemd-machined[154022]: New machine qemu-69-instance-00000095.
Jan 21 19:20:12 np0005591285 NetworkManager[55017]: <info>  [1769041212.5535] device (tap5f30e7af-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:20:12 np0005591285 NetworkManager[55017]: <info>  [1769041212.5554] device (tap5f30e7af-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:20:12 np0005591285 systemd[1]: Started Virtual Machine qemu-69-instance-00000095.
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.570 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c106cea9-279b-440c-a8f9-f2926a72a8cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.573 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[aa667641-4776-4288-bfdb-9c5dcd56d05f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.601 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[1229da36-0faf-440e-97f2-bdbbfedc1195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.620 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[71440af1-9a72-4454-80d2-617cc51cc298]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09aa8d20-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:d3:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 7, 'tx_packets': 5, 'rx_bytes': 574, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 7, 'tx_packets': 5, 'rx_bytes': 574, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574940, 'reachable_time': 23910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235342, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.635 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa79329-1e46-47f0-8c79-ca53b7dd64b1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap09aa8d20-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574951, 'tstamp': 574951}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235343, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap09aa8d20-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574954, 'tstamp': 574954}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235343, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.638 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09aa8d20-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.639 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.641 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09aa8d20-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.641 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.641 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.642 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09aa8d20-e0, col_values=(('external_ids', {'iface-id': '41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:12.642 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.928 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041212.927232, 04c5bee9-8745-45b2-884d-2abfbbec5d0e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.929 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] VM Started (Lifecycle Event)#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.955 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.961 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041212.9323246, 04c5bee9-8745-45b2-884d-2abfbbec5d0e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.961 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.988 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:20:12 np0005591285 nova_compute[182755]: 2026-01-22 00:20:12.992 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.012 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.700 182759 DEBUG nova.network.neutron [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updated VIF entry in instance network info cache for port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.701 182759 DEBUG nova.network.neutron [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updating instance_info_cache with network_info: [{"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.719 182759 DEBUG oslo_concurrency.lockutils [req-a18beb62-b98e-4972-b946-e7fee93191ac req-21085c91-d9ed-4284-9a9a-8c843e99d34f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.977 182759 DEBUG nova.compute.manager [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.977 182759 DEBUG oslo_concurrency.lockutils [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.977 182759 DEBUG oslo_concurrency.lockutils [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.978 182759 DEBUG oslo_concurrency.lockutils [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.978 182759 DEBUG nova.compute.manager [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Processing event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.978 182759 DEBUG nova.compute.manager [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.978 182759 DEBUG oslo_concurrency.lockutils [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.979 182759 DEBUG oslo_concurrency.lockutils [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.979 182759 DEBUG oslo_concurrency.lockutils [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.979 182759 DEBUG nova.compute.manager [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] No waiting events found dispatching network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.979 182759 WARNING nova.compute.manager [req-5b014c2d-dcd1-4054-afa4-e1450ca4b75f req-ce570bb2-1d4d-4a12-b3f6-45e57e1af912 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received unexpected event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df for instance with vm_state building and task_state spawning.
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.980 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.983 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041213.983127, 04c5bee9-8745-45b2-884d-2abfbbec5d0e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.984 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] VM Resumed (Lifecycle Event)
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.985 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.987 182759 INFO nova.virt.libvirt.driver [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Instance spawned successfully.
Jan 21 19:20:13 np0005591285 nova_compute[182755]: 2026-01-22 00:20:13.988 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.009 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.015 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.017 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.017 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.018 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.018 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.018 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.018 182759 DEBUG nova.virt.libvirt.driver [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.071 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.152 182759 INFO nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Took 7.88 seconds to spawn the instance on the hypervisor.
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.152 182759 DEBUG nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:20:14 np0005591285 podman[235352]: 2026-01-22 00:20:14.184087696 +0000 UTC m=+0.055222684 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.262 182759 INFO nova.compute.manager [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Took 9.23 seconds to build instance.
Jan 21 19:20:14 np0005591285 nova_compute[182755]: 2026-01-22 00:20:14.284 182759 DEBUG oslo_concurrency.lockutils [None req-016582db-ff6a-49dc-9271-3db9bcc99cf9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:20:16 np0005591285 nova_compute[182755]: 2026-01-22 00:20:16.663 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:20:17 np0005591285 podman[235377]: 2026-01-22 00:20:17.189653028 +0000 UTC m=+0.052009778 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:20:17 np0005591285 podman[235376]: 2026-01-22 00:20:17.189653228 +0000 UTC m=+0.055452620 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:20:17 np0005591285 nova_compute[182755]: 2026-01-22 00:20:17.819 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:20:19 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:19Z|00584|binding|INFO|Releasing lport 41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae from this chassis (sb_readonly=0)
Jan 21 19:20:19 np0005591285 nova_compute[182755]: 2026-01-22 00:20:19.253 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:20:20 np0005591285 podman[235416]: 2026-01-22 00:20:20.212436249 +0000 UTC m=+0.088094599 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 19:20:21 np0005591285 nova_compute[182755]: 2026-01-22 00:20:21.666 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:20:22 np0005591285 nova_compute[182755]: 2026-01-22 00:20:22.635 182759 DEBUG nova.compute.manager [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-changed-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:20:22 np0005591285 nova_compute[182755]: 2026-01-22 00:20:22.636 182759 DEBUG nova.compute.manager [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Refreshing instance network info cache due to event network-changed-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:20:22 np0005591285 nova_compute[182755]: 2026-01-22 00:20:22.636 182759 DEBUG oslo_concurrency.lockutils [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:20:22 np0005591285 nova_compute[182755]: 2026-01-22 00:20:22.636 182759 DEBUG oslo_concurrency.lockutils [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:20:22 np0005591285 nova_compute[182755]: 2026-01-22 00:20:22.636 182759 DEBUG nova.network.neutron [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Refreshing network info cache for port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:20:22 np0005591285 nova_compute[182755]: 2026-01-22 00:20:22.821 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.172 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'name': 'tempest-TestNetworkBasicOps-server-205298966', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000095', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34b96b4037d24a0ea19383ca2477b2fd', 'user_id': '833f1e9dce90456ea55a443da6704907', 'hostId': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.176 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'name': 'tempest-TestNetworkBasicOps-server-398620410', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000094', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34b96b4037d24a0ea19383ca2477b2fd', 'user_id': '833f1e9dce90456ea55a443da6704907', 'hostId': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.208 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.210 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.239 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.240 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dcb3428-bef8-43ec-9a2b-6d571d6577fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.177342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24878288-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': 'f6380d8a5cc8d35b66c610633bcd200ef75a8914e95ecfb4ce4b4b229cb71d68'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.177342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2487a146-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '52734692b67227997b608b7b1fdde6e09669ace7585d4508778ff5f81e08fb08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1075, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.177342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '248c0b14-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '8f4c1a2e44acbe14fc64c5d396746df5c44ddcee91c359f2f666271647960bb3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.177342', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '248c2496-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': 'a9299a54422834bb784d17f09ad39ba193a77626ad1283a118aeef0bd2cfefcb'}]}, 'timestamp': '2026-01-22 00:20:23.240604', '_unique_id': '61b9ac5a19f443aea6ee6e9e4a2f059f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.251 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 04c5bee9-8745-45b2-884d-2abfbbec5d0e / tap5f30e7af-d4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.251 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.254 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6b668707-d685-4bda-bfbf-c52a9214fc5a / tap93aa6e48-d8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.254 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc3c7097-8aea-40b1-9b30-244fe85f410f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.245492', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '248df83e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': 'b96dba3e35379724e2103767516892963f634778cc58c864c3daa9062c35a1b2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.245492', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '248e5e46-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '80d9981c16738e98b52fc782707bcac075134ff908a9af4e2777e253db06bdf2'}]}, 'timestamp': '2026-01-22 00:20:23.254959', '_unique_id': '5ecd826e05f64a07945fb1050e0425ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.255 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.257 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.257 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1cf604b-0db9-4b7e-ba2f-e6205c0d77e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.257415', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '248ece94-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': 'f1092111da2d54619bf16173b128f0edbf4e6db1355edb1de82f837b49258340'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.257415', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '248ed9b6-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '11a6175750c4825de0f3412bc098599740b78ce391febfc0aa8fb326eebc35a0'}]}, 'timestamp': '2026-01-22 00:20:23.258061', '_unique_id': 'b21861acb75b4ea88e318b7930a4a8ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.259 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.276 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/cpu volume: 8790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.292 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/cpu volume: 12200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bb13ef3-274b-4321-b646-129a9870f89a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8790000000, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'timestamp': '2026-01-22T00:20:23.259815', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2491b730-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.995373173, 'message_signature': '88d450b937d8d5335910c2d6cc0b39b8851363a422c542f0e9c0d6b81fc6fe3b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12200000000, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'6b668707-d685-4bda-bfbf-c52a9214fc5a', 'timestamp': '2026-01-22T00:20:23.259815', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '24944360-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.012030167, 'message_signature': 'dc98c35d5b71f15ca56f2d95e06564ff824201034d85004d23bd23d07d85ac44'}]}, 'timestamp': '2026-01-22 00:20:23.293607', '_unique_id': '944420e7215144fd88b46574a7953449'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.295 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.read.latency volume: 115707418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.296 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.read.latency volume: 854612 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.296 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.read.latency volume: 199210254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.296 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.read.latency volume: 20375330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59f1a4d9-cb73-4c1b-bbef-3845dbb02350', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 115707418, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.295848', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2494ac56-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '3c70f1c0840dcdb30bb24ea7e4573f2783d8b63d2ec5823f37ad8d7b164b811e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 854612, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.295848', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2494b9e4-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': 'e75fb6493106129f4830eb298c1176305be66a36a74d63c441ed46ebc1359fe8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199210254, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.295848', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2494c740-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': 'bdd88f1461128e5190a2af5031e2809e6ddcfc4c110e0a1564f33ed7c3e16ae8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20375330, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.295848', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2494d42e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '5e5ccd83ccf52d1dbf6c4c1a786d08132feee976dfa1b244579072398b883444'}]}, 'timestamp': '2026-01-22 00:20:23.297275', '_unique_id': 'e32fcb24c4e749aeabd9c91b570ac53b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.298 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.299 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.312 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.313 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.324 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.325 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c9c5daa-289c-454e-8f90-56e00e7b421c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.299699', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249753f2-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.018942941, 'message_signature': 'c5089a48bb940400f28e87cac94f137632b810b8e29271336ef338ca9e33b4f4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.299699', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24976540-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.018942941, 'message_signature': 'ad1f30b59d335c3da6587f58c496a2d089d022be6feb8dae8273085472da1a9c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.299699', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24990e5e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.033483009, 'message_signature': '647b599196ad35bc10c76b777cdce0a9a6b7bf0ee5e550efeacec3e2c25df09a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.299699', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24991fd4-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.033483009, 'message_signature': '5afeaf75ec5e7b4bdc666c095cd79c1c400208be706154b22ec2c74f9beac2d2'}]}, 'timestamp': '2026-01-22 00:20:23.325433', '_unique_id': '9f85b4dd377543399842ba38d19975bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.328 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.328 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7f064dc-aec4-4038-a953-151013ec792e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.328206', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '24999e5a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': '103518710113484d75057dcd9561025e590b6963fdf2860d0d3afdc4a14ba79a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.328206', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '2499ad8c-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '21a7d6be21a0e7da4a679c7898636c119e4fbaf399f0f914a3ae2c303e39552a'}]}, 'timestamp': '2026-01-22 00:20:23.329162', '_unique_id': '515e8563f8a14bfa9743e3f44423fbb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.330 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.333 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.333 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>]
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.333 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.334 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6be22ba1-09e4-4bab-aa1f-b9fd05eaf8d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.333928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249a8248-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': '8e05d4cb6b28e25ecd6301880fcd0a821d225527f5d0880b0c8085e9646b2f68'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.333928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249a8fc2-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '8ee931008511384c48fda68c7e6e3018a7db60c350550310f53b897c32895678'}]}, 'timestamp': '2026-01-22 00:20:23.334846', '_unique_id': '64b62faa0e18449e9bcc415f05912272'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.336 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.336 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.337 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.337 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.337 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7829574b-de03-4353-b83e-5b973523cbdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.336937', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249aedfa-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.018942941, 'message_signature': '9007736516010f888958f8fd780150c3805a3da272665cde7b92ff41d159e853'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.336937', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249afb6a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.018942941, 'message_signature': 'd805d2f1031c91fac3d65eda3f9f1bb6eea5375acbbbc41f497471d5fa903dad'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.336937', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249b065a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.033483009, 'message_signature': 'ac417cc44ff68b392480d240dc3e350f37786607fd05a7731ca78abb5829f77d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.336937', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249b1168-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.033483009, 'message_signature': '3a5f4648e88fefbf92d302493b21b11a4ce1c0165275a133e1ae182f1592b0e1'}]}, 'timestamp': '2026-01-22 00:20:23.338132', '_unique_id': 'fe4e24238fec40e992ecb7a683802dcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.339 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.339 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.339 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>]
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.340 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.340 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.340 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.340 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27ffa5e2-0102-48e6-af74-f980a2ed2e43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.340088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249b6a50-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '818da5a79180991917e37cc3bc36ed500d2c5b9a5c539d9c7f5441509f947615'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.340088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249b73b0-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '38d16644921b31f1747899fd89d9d17555ab17a5b20ee5676e3977771ef6940e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.340088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249b7ef0-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '00691ef57ccceddc04276fed8a9e7ad59c0554728046e1165d6976dcb5914b3f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.340088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249b8b66-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '03bf0ba1dc4461c0609268a3daa52b4f64218c4affdfbede2617a685245de21a'}]}, 'timestamp': '2026-01-22 00:20:23.341247', '_unique_id': '88d05c2ef57a4e8d95af293fe8d7155f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.342 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.342 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.343 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.343 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.343 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6025e480-81dc-408c-9e08-ec864e2a108d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.342901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249bd6b6-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': 'de0da0674062ab92236c5475bbea911d611a241124d696cfbbb26fd3ae59bec0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': 
None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.342901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249be174-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '1a45601da25e542fd7f3a6a1a4722f07d2787fb30a2ebe252d3f719c5981422b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.342901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249beb74-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '9648a66ec536354105eb047a070cbc6b749e541ff94666a831eb5761d6146bc5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.342901', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249bf52e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '36d37e67296847637ad662dea010faa5404af2ff5db797211ed374ed9517d69f'}]}, 'timestamp': '2026-01-22 00:20:23.344006', '_unique_id': 'ece581d11aa445e7a16cfe271190628c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.345 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.346 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.346 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.346 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b507ee23-3be8-4f97-9951-a8699c4a81aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.345745', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249c4632-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.018942941, 'message_signature': '75111b53fef8691d1ce302015c7dcaad4f755a34750061877e87c007b52ef9c9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.345745', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249c5136-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.018942941, 'message_signature': '626b947b11ca487a99ea443f3cb5687b553b4e4715fcfbf8d00f50fc45275c82'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.345745', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 
'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249c59c4-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.033483009, 'message_signature': 'e513c88771d89380405f2b3875de66e60e650e6e7fe864aa265187b7d61b65e8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.345745', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249c6234-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.033483009, 'message_signature': 'd302626978a654fe396954e3964904d0662c14688f1393fdeb46b7e91a892b0b'}]}, 'timestamp': '2026-01-22 00:20:23.346754', '_unique_id': 'fb400ea0cb00479595b5e737c1918369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.347 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.348 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>]
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.348 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.348 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'badb699c-816a-4cda-86c8-73f62429c40c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.348311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249ca898-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': 'acae6e9de2bb74322ebca93f495ba471a4d4daee76b68a751f03497aca0fd184'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.348311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249cb180-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '7bb08dfe1c8e0e7cc3fc93f552885683849a9adeba0a4de243a2e4d70a65e3ee'}]}, 'timestamp': '2026-01-22 00:20:23.348785', '_unique_id': '532df574f5504b9095d6e9105fca51d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.350 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.350 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.350 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.350 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.write.latency volume: 8474654522 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.350 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7daf696-c4a9-44a1-be25-3f27a2d35232', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.350114', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249ceea2-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '6ec32b7311d92e3dfeb3fa4ccda1500737cf27e56d2d1bf75659a25387f24d1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.350114', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249cf6c2-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '59344f90919114c43fdd6f3e4919d0304baf0c527fda29cd076b1c90ff9a0d84'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8474654522, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.350114', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249cfec4-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '18fb4c1e76f8b143b0a770b84469433ad9a9829a5b2beac14fb7d1d582f2fc55'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.350114', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249d069e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '34f76459f86a35cef1a00fb9923a8f24235e8e39ef668679e38f5d32ef2c2657'}]}, 'timestamp': '2026-01-22 00:20:23.350985', '_unique_id': 'b07abc77302b42a2a08f6dc501e50e9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.352 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.352 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f1ea781-cdd6-459d-a441-868d5a7cda03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.352219', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249d410e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': 'd402d8be20f9ea9f228080949773edcfe010de5e38719da14cefd163089a3b29'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 
'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.352219', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249d49ec-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': 'e705fb308b32fae1792a343402df1e9ebc8e67ab3e4bda2ff3143c68c116c74e'}]}, 'timestamp': '2026-01-22 00:20:23.352687', '_unique_id': 'b646d9b9509249cdab1260390be6a6ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.354 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.354 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.354 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 04c5bee9-8745-45b2-884d-2abfbbec5d0e: ceilometer.compute.pollsters.NoVolumeException
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.354 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/memory.usage volume: 42.875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6224d6e1-1dfd-4b2d-bbdd-ccc3c8440a31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.875, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'timestamp': '2026-01-22T00:20:23.354161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '249d93de-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5782.012030167, 'message_signature': '35d03d35dbd0f4a390695b3f785b1310dbf0b52c0645913495497b52b52d4445'}]}, 'timestamp': '2026-01-22 00:20:23.354579', '_unique_id': '425601b7db6c42919b08280548c9d4b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.355 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-205298966>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-398620410>]
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.356 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.356 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.356 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfd1a947-a4d7-49cd-9d57-d871f92945aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.356173', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249ddb96-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': 'f9c6b107820d63751f9686a25a0770ea9d3c29671d9963dbb0d2c6b804ade769'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.356173', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249de4ba-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '2823ef42281e69a0cce958f5d502565f4ed9118972344c92a06fcd89e91ba7e1'}]}, 'timestamp': '2026-01-22 00:20:23.356649', '_unique_id': 'caa9d9a9623c471496136a3f0df0a503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.358 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.358 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65ead5d8-b5ce-46a7-aac8-ea74d4462c1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.358099', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249e2696-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': '52df353fbe468fc31d86d6ae01034eaab205352c4b2018be05d14e60e72ff1de'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': 
'833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.358099', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249e2f7e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '77a3cbc5e0fa3b361e1b97ab274f581dc13b45c4fd263300d9f0e9ad0ec0fe70'}]}, 'timestamp': '2026-01-22 00:20:23.358562', '_unique_id': '3560fe1960df47ce9861775220694b77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.359 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0057af0-f4f6-4e22-ae16-df059203fbd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.359769', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249e68cc-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': 'c3c040bbfd70cd5db27a9a80992a700c39691e6bbf7262c85458e708b310f522'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.359769', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249e72a4-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '824d3c782ff7ea04221c1f46254adc4db58a66abec4363bd4307c82e4445a49a'}]}, 'timestamp': '2026-01-22 00:20:23.360285', '_unique_id': '383c3dd2bb184954886e9ad1facb68d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.360 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.361 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.361 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.361 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.read.bytes volume: 29997568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5750c26-e1fc-405e-8cd6-da3aaf307de1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-vda', 'timestamp': '2026-01-22T00:20:23.361515', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249eac2e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '877e40ba1fa1b7f22f8c1d58887494f380afd1ebe91dcf7521e66967a2629565'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e-sda', 'timestamp': '2026-01-22T00:20:23.361515', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'instance-00000095', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249eb4e4-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.896588878, 'message_signature': '551863522a29497615be51965e3c63e539a6785d3692724745904a40bd88c006'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29997568, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-vda', 'timestamp': '2026-01-22T00:20:23.361515', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249ebdea-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': 'fdf5ebd52f56eaee1194ce9994b037c754b6991baabe73700d435745e9f4b50c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a-sda', 'timestamp': '2026-01-22T00:20:23.361515', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'instance-00000094', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249ec542-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.930061561, 'message_signature': '86f63525b62f757f6611998ad8ab1569ce98db037b178cb7c29b33736eb60a5e'}]}, 'timestamp': '2026-01-22 00:20:23.362379', '_unique_id': 'fee85d918a5d4745bf5ccc873c79bc3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.363 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.363 12 DEBUG ceilometer.compute.pollsters [-] 04c5bee9-8745-45b2-884d-2abfbbec5d0e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 DEBUG ceilometer.compute.pollsters [-] 6b668707-d685-4bda-bfbf-c52a9214fc5a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e48de544-a182-425b-9a43-58e1c7498af6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000095-04c5bee9-8745-45b2-884d-2abfbbec5d0e-tap5f30e7af-d4', 'timestamp': '2026-01-22T00:20:23.363677', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-205298966', 'name': 'tap5f30e7af-d4', 'instance_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:a9:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5f30e7af-d4'}, 'message_id': '249f025a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.964730966, 'message_signature': '372c66c7cf05b71dca7efc297c44e7bddd13cbe5ad524efee6b4a2c9c9cf4c77'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000094-6b668707-d685-4bda-bfbf-c52a9214fc5a-tap93aa6e48-d8', 'timestamp': '2026-01-22T00:20:23.363677', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-398620410', 'name': 'tap93aa6e48-d8', 'instance_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'instance_type': 'm1.nano', 'host': 'e5ddce960d32daad00025894dc3b34f2163b2588c9dc00067da2324e', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d8:48:67', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap93aa6e48-d8'}, 'message_id': '249f0d7c-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 5781.971548267, 'message_signature': '35c59a6926569679c1345e91f523d80f4513cbb2fae32b5feadb38fe262a2d95'}]}, 'timestamp': '2026-01-22 00:20:23.364261', '_unique_id': '57564f51fcc3496aa86771e93e189895'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:20:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:20:23.364 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:20:24 np0005591285 nova_compute[182755]: 2026-01-22 00:20:24.721 182759 DEBUG nova.network.neutron [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updated VIF entry in instance network info cache for port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:20:24 np0005591285 nova_compute[182755]: 2026-01-22 00:20:24.722 182759 DEBUG nova.network.neutron [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updating instance_info_cache with network_info: [{"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:25 np0005591285 nova_compute[182755]: 2026-01-22 00:20:25.020 182759 DEBUG oslo_concurrency.lockutils [req-47b39aac-ffae-4f76-a0ba-b1095c5cc3e5 req-cf1a07b3-ec8b-4ebc-831f-1b54d427b52d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:26 np0005591285 nova_compute[182755]: 2026-01-22 00:20:26.668 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:26Z|00585|binding|INFO|Releasing lport 41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae from this chassis (sb_readonly=0)
Jan 21 19:20:27 np0005591285 nova_compute[182755]: 2026-01-22 00:20:27.038 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:27 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:27Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:81:a9:f6 10.100.0.4
Jan 21 19:20:27 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:27Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:81:a9:f6 10.100.0.4
Jan 21 19:20:27 np0005591285 nova_compute[182755]: 2026-01-22 00:20:27.823 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:31 np0005591285 nova_compute[182755]: 2026-01-22 00:20:31.671 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:32 np0005591285 nova_compute[182755]: 2026-01-22 00:20:32.826 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:35 np0005591285 podman[235461]: 2026-01-22 00:20:35.211754426 +0000 UTC m=+0.068020154 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 21 19:20:35 np0005591285 podman[235460]: 2026-01-22 00:20:35.235399307 +0000 UTC m=+0.095358243 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Jan 21 19:20:36 np0005591285 nova_compute[182755]: 2026-01-22 00:20:36.673 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:37 np0005591285 nova_compute[182755]: 2026-01-22 00:20:37.826 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:38 np0005591285 nova_compute[182755]: 2026-01-22 00:20:38.451 182759 INFO nova.compute.manager [None req-2558dece-2679-4088-ae9d-2df8482a1ac7 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Get console output#033[00m
Jan 21 19:20:38 np0005591285 nova_compute[182755]: 2026-01-22 00:20:38.460 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:20:40 np0005591285 nova_compute[182755]: 2026-01-22 00:20:40.245 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:40 np0005591285 nova_compute[182755]: 2026-01-22 00:20:40.346 182759 DEBUG nova.compute.manager [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:40 np0005591285 nova_compute[182755]: 2026-01-22 00:20:40.346 182759 DEBUG nova.compute.manager [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing instance network info cache due to event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:20:40 np0005591285 nova_compute[182755]: 2026-01-22 00:20:40.346 182759 DEBUG oslo_concurrency.lockutils [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:40 np0005591285 nova_compute[182755]: 2026-01-22 00:20:40.346 182759 DEBUG oslo_concurrency.lockutils [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:40 np0005591285 nova_compute[182755]: 2026-01-22 00:20:40.347 182759 DEBUG nova.network.neutron [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:20:41 np0005591285 nova_compute[182755]: 2026-01-22 00:20:41.415 182759 INFO nova.compute.manager [None req-6268b28e-91c7-4f6b-9792-ea15fe7d7503 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Get console output#033[00m
Jan 21 19:20:41 np0005591285 nova_compute[182755]: 2026-01-22 00:20:41.419 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:20:41 np0005591285 nova_compute[182755]: 2026-01-22 00:20:41.676 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.510 182759 DEBUG nova.compute.manager [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-unplugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.510 182759 DEBUG oslo_concurrency.lockutils [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.510 182759 DEBUG oslo_concurrency.lockutils [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.511 182759 DEBUG oslo_concurrency.lockutils [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.511 182759 DEBUG nova.compute.manager [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-unplugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.511 182759 WARNING nova.compute.manager [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-unplugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.511 182759 DEBUG nova.compute.manager [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.511 182759 DEBUG oslo_concurrency.lockutils [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.512 182759 DEBUG oslo_concurrency.lockutils [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.512 182759 DEBUG oslo_concurrency.lockutils [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.512 182759 DEBUG nova.compute.manager [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.512 182759 WARNING nova.compute.manager [req-776b34de-ae6e-46ca-8c0f-04292997d7dc req-8024a037-e1b6-4f16-8c96-790b5652b7c0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:20:42 np0005591285 nova_compute[182755]: 2026-01-22 00:20:42.828 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.257 182759 DEBUG nova.network.neutron [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updated VIF entry in instance network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.258 182759 DEBUG nova.network.neutron [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.287 182759 DEBUG oslo_concurrency.lockutils [req-9a96a3b4-f1fb-4e7e-827b-c3be299795ea req-83813a7b-e3e5-425d-b4f5-422902b9c7af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.757 182759 DEBUG nova.compute.manager [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.757 182759 DEBUG nova.compute.manager [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing instance network info cache due to event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.757 182759 DEBUG oslo_concurrency.lockutils [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.757 182759 DEBUG oslo_concurrency.lockutils [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:43 np0005591285 nova_compute[182755]: 2026-01-22 00:20:43.758 182759 DEBUG nova.network.neutron [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.087 182759 INFO nova.compute.manager [None req-8738d0ff-7507-4888-94d1-d46d2c46ae4b 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Get console output#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.092 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.713 182759 DEBUG nova.compute.manager [req-f37b322c-b3b9-470c-a2c1-19674cb0669c req-baad6445-e94f-4bf8-88ea-0429e1b15f32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.713 182759 DEBUG oslo_concurrency.lockutils [req-f37b322c-b3b9-470c-a2c1-19674cb0669c req-baad6445-e94f-4bf8-88ea-0429e1b15f32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.714 182759 DEBUG oslo_concurrency.lockutils [req-f37b322c-b3b9-470c-a2c1-19674cb0669c req-baad6445-e94f-4bf8-88ea-0429e1b15f32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.714 182759 DEBUG oslo_concurrency.lockutils [req-f37b322c-b3b9-470c-a2c1-19674cb0669c req-baad6445-e94f-4bf8-88ea-0429e1b15f32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.714 182759 DEBUG nova.compute.manager [req-f37b322c-b3b9-470c-a2c1-19674cb0669c req-baad6445-e94f-4bf8-88ea-0429e1b15f32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:44 np0005591285 nova_compute[182755]: 2026-01-22 00:20:44.714 182759 WARNING nova.compute.manager [req-f37b322c-b3b9-470c-a2c1-19674cb0669c req-baad6445-e94f-4bf8-88ea-0429e1b15f32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:20:45 np0005591285 podman[235498]: 2026-01-22 00:20:45.192067293 +0000 UTC m=+0.061516201 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.679 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:46.766 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.767 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:46.767 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.895 182759 DEBUG nova.compute.manager [req-34d04fe3-e6b1-4482-bcd5-ba501f29c0a2 req-679a4f79-71af-4da3-8933-71c25da8bf3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.896 182759 DEBUG oslo_concurrency.lockutils [req-34d04fe3-e6b1-4482-bcd5-ba501f29c0a2 req-679a4f79-71af-4da3-8933-71c25da8bf3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.896 182759 DEBUG oslo_concurrency.lockutils [req-34d04fe3-e6b1-4482-bcd5-ba501f29c0a2 req-679a4f79-71af-4da3-8933-71c25da8bf3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.896 182759 DEBUG oslo_concurrency.lockutils [req-34d04fe3-e6b1-4482-bcd5-ba501f29c0a2 req-679a4f79-71af-4da3-8933-71c25da8bf3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.897 182759 DEBUG nova.compute.manager [req-34d04fe3-e6b1-4482-bcd5-ba501f29c0a2 req-679a4f79-71af-4da3-8933-71c25da8bf3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:46 np0005591285 nova_compute[182755]: 2026-01-22 00:20:46.897 182759 WARNING nova.compute.manager [req-34d04fe3-e6b1-4482-bcd5-ba501f29c0a2 req-679a4f79-71af-4da3-8933-71c25da8bf3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.050 182759 DEBUG nova.network.neutron [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updated VIF entry in instance network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.051 182759 DEBUG nova.network.neutron [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.206 182759 DEBUG oslo_concurrency.lockutils [req-fd060af4-35bd-4df2-aec9-03b6c83bb670 req-43fa9706-44e8-49bf-9873-733d0d7234f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.642 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.642 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.642 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.643 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.643 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.663 182759 INFO nova.compute.manager [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Terminating instance#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.674 182759 DEBUG nova.compute.manager [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:20:47 np0005591285 kernel: tap5f30e7af-d4 (unregistering): left promiscuous mode
Jan 21 19:20:47 np0005591285 NetworkManager[55017]: <info>  [1769041247.7010] device (tap5f30e7af-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.710 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:47Z|00586|binding|INFO|Releasing lport 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df from this chassis (sb_readonly=0)
Jan 21 19:20:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:47Z|00587|binding|INFO|Setting lport 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df down in Southbound
Jan 21 19:20:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:47Z|00588|binding|INFO|Removing iface tap5f30e7af-d4 ovn-installed in OVS
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.715 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.726 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.758 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:a9:f6 10.100.0.4'], port_security=['fa:16:3e:81:a9:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '04c5bee9-8745-45b2-884d-2abfbbec5d0e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09aa8d20-eb46-4367-945b-494fddadbef9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e3745aef-6a18-4d3a-b122-7a49de407273', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=def33975-878d-4f5c-8b1e-729e778f0cd5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.759 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df in datapath 09aa8d20-eb46-4367-945b-494fddadbef9 unbound from our chassis#033[00m
Jan 21 19:20:47 np0005591285 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000095.scope: Deactivated successfully.
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.760 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09aa8d20-eb46-4367-945b-494fddadbef9#033[00m
Jan 21 19:20:47 np0005591285 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000095.scope: Consumed 13.992s CPU time.
Jan 21 19:20:47 np0005591285 systemd-machined[154022]: Machine qemu-69-instance-00000095 terminated.
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.782 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5aab9926-0968-4f5e-908c-c77813f1b623]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:47 np0005591285 podman[235527]: 2026-01-22 00:20:47.799942723 +0000 UTC m=+0.062235481 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:20:47 np0005591285 podman[235524]: 2026-01-22 00:20:47.799997294 +0000 UTC m=+0.072723310 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.814 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[96ffaa47-f6fc-4d23-961f-97d6345b0623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.818 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[5d48c059-e479-494f-93a2-67c5c6d46ea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.851 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[63afc43c-57df-4242-b990-6609a7cf27f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.871 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4e72f2de-f56e-4557-8d1a-7c99ed9a7b6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09aa8d20-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:d3:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574940, 'reachable_time': 23910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235575, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.888 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2aef9c26-cf2a-4b65-a010-0f74f3c79bf6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap09aa8d20-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574951, 'tstamp': 574951}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235576, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap09aa8d20-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574954, 'tstamp': 574954}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235576, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.891 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09aa8d20-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.892 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.902 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.902 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09aa8d20-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.902 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.903 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09aa8d20-e0, col_values=(('external_ids', {'iface-id': '41a0cd65-2b8a-41a7-a2ad-c71bf323d9ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:47.903 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.931 182759 INFO nova.virt.libvirt.driver [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Instance destroyed successfully.#033[00m
Jan 21 19:20:47 np0005591285 nova_compute[182755]: 2026-01-22 00:20:47.931 182759 DEBUG nova.objects.instance [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 04c5bee9-8745-45b2-884d-2abfbbec5d0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.018 182759 DEBUG nova.virt.libvirt.vif [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-205298966',display_name='tempest-TestNetworkBasicOps-server-205298966',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-205298966',id=149,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3d2Do150sU4k4vLE98K5ZXopEIgo5vuyIpIC/J+P0c2gsHillvzH56tRAy7PK/iETBFSZEp/AHY9tJxPJOszxY/i7AfqrLWgGciqd22u5Iwpdczd5JmZABg5vIE2R2GA==',key_name='tempest-TestNetworkBasicOps-2098586360',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-h209qu80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:20:14Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=04c5bee9-8745-45b2-884d-2abfbbec5d0e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.018 182759 DEBUG nova.network.os_vif_util [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.019 182759 DEBUG nova.network.os_vif_util [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.019 182759 DEBUG os_vif [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.021 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.022 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f30e7af-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.114 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.120 182759 INFO os_vif [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:81:a9:f6,bridge_name='br-int',has_traffic_filtering=True,id=5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f30e7af-d4')#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.120 182759 INFO nova.virt.libvirt.driver [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Deleting instance files /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e_del#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.121 182759 INFO nova.virt.libvirt.driver [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Deletion of /var/lib/nova/instances/04c5bee9-8745-45b2-884d-2abfbbec5d0e_del complete#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.250 182759 INFO nova.compute.manager [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.250 182759 DEBUG oslo.service.loopingcall [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.251 182759 DEBUG nova.compute.manager [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.251 182759 DEBUG nova.network.neutron [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.400 182759 DEBUG nova.compute.manager [req-37990ff0-b269-44c9-ba26-c131dcb1a40a req-f96d029c-f8e5-4a60-bf50-8328a6b5bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-vif-unplugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.400 182759 DEBUG oslo_concurrency.lockutils [req-37990ff0-b269-44c9-ba26-c131dcb1a40a req-f96d029c-f8e5-4a60-bf50-8328a6b5bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.400 182759 DEBUG oslo_concurrency.lockutils [req-37990ff0-b269-44c9-ba26-c131dcb1a40a req-f96d029c-f8e5-4a60-bf50-8328a6b5bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.400 182759 DEBUG oslo_concurrency.lockutils [req-37990ff0-b269-44c9-ba26-c131dcb1a40a req-f96d029c-f8e5-4a60-bf50-8328a6b5bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.401 182759 DEBUG nova.compute.manager [req-37990ff0-b269-44c9-ba26-c131dcb1a40a req-f96d029c-f8e5-4a60-bf50-8328a6b5bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] No waiting events found dispatching network-vif-unplugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:48 np0005591285 nova_compute[182755]: 2026-01-22 00:20:48.401 182759 DEBUG nova.compute.manager [req-37990ff0-b269-44c9-ba26-c131dcb1a40a req-f96d029c-f8e5-4a60-bf50-8328a6b5bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-vif-unplugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.005 182759 DEBUG nova.compute.manager [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-changed-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.006 182759 DEBUG nova.compute.manager [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Refreshing instance network info cache due to event network-changed-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.006 182759 DEBUG oslo_concurrency.lockutils [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.006 182759 DEBUG oslo_concurrency.lockutils [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.006 182759 DEBUG nova.network.neutron [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Refreshing network info cache for port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.457 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.629 182759 DEBUG nova.network.neutron [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.652 182759 INFO nova.compute.manager [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Took 1.40 seconds to deallocate network for instance.#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.746 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.746 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.871 182759 DEBUG nova.compute.provider_tree [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.886 182759 DEBUG nova.scheduler.client.report [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.912 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:49 np0005591285 nova_compute[182755]: 2026-01-22 00:20:49.966 182759 INFO nova.scheduler.client.report [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 04c5bee9-8745-45b2-884d-2abfbbec5d0e#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.067 182759 DEBUG oslo_concurrency.lockutils [None req-e409018d-7161-48bd-81cd-680d0c49c864 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.488 182759 DEBUG nova.network.neutron [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updated VIF entry in instance network info cache for port 5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.489 182759 DEBUG nova.network.neutron [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Updating instance_info_cache with network_info: [{"id": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "address": "fa:16:3e:81:a9:f6", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f30e7af-d4", "ovs_interfaceid": "5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.518 182759 DEBUG oslo_concurrency.lockutils [req-6760ce51-cde4-46c1-a119-7e689eae4698 req-e3432234-1ba7-4086-8202-b398ea1cd78c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-04c5bee9-8745-45b2-884d-2abfbbec5d0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.520 182759 DEBUG nova.compute.manager [req-00434600-ed80-45e0-81c1-6c4dafdb26e2 req-652c2045-dcc4-4e8c-a2a5-fa8ca1fe6344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.521 182759 DEBUG oslo_concurrency.lockutils [req-00434600-ed80-45e0-81c1-6c4dafdb26e2 req-652c2045-dcc4-4e8c-a2a5-fa8ca1fe6344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.522 182759 DEBUG oslo_concurrency.lockutils [req-00434600-ed80-45e0-81c1-6c4dafdb26e2 req-652c2045-dcc4-4e8c-a2a5-fa8ca1fe6344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.522 182759 DEBUG oslo_concurrency.lockutils [req-00434600-ed80-45e0-81c1-6c4dafdb26e2 req-652c2045-dcc4-4e8c-a2a5-fa8ca1fe6344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "04c5bee9-8745-45b2-884d-2abfbbec5d0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.522 182759 DEBUG nova.compute.manager [req-00434600-ed80-45e0-81c1-6c4dafdb26e2 req-652c2045-dcc4-4e8c-a2a5-fa8ca1fe6344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] No waiting events found dispatching network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:50 np0005591285 nova_compute[182755]: 2026-01-22 00:20:50.522 182759 WARNING nova.compute.manager [req-00434600-ed80-45e0-81c1-6c4dafdb26e2 req-652c2045-dcc4-4e8c-a2a5-fa8ca1fe6344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received unexpected event network-vif-plugged-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:20:51 np0005591285 nova_compute[182755]: 2026-01-22 00:20:51.142 182759 DEBUG nova.compute.manager [req-3c806374-b3a5-4a35-9fcb-414aaa6ba5de req-0e274215-817c-4b72-bf9c-a9961ddbc0de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Received event network-vif-deleted-5f30e7af-d4d9-44f1-9c51-4ab3afc2e3df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:51 np0005591285 podman[235595]: 2026-01-22 00:20:51.22209288 +0000 UTC m=+0.088320585 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:20:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:51.771 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.678 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.679 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.679 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.679 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6b668707-d685-4bda-bfbf-c52a9214fc5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.845 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.846 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.846 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.846 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.847 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.862 182759 INFO nova.compute.manager [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Terminating instance#033[00m
Jan 21 19:20:52 np0005591285 nova_compute[182755]: 2026-01-22 00:20:52.880 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.095 182759 DEBUG nova.compute.manager [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:20:53 np0005591285 kernel: tap93aa6e48-d8 (unregistering): left promiscuous mode
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 NetworkManager[55017]: <info>  [1769041253.1190] device (tap93aa6e48-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:53Z|00589|binding|INFO|Releasing lport 93aa6e48-d8e4-4f3c-b816-eedca06529c0 from this chassis (sb_readonly=0)
Jan 21 19:20:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:53Z|00590|binding|INFO|Setting lport 93aa6e48-d8e4-4f3c-b816-eedca06529c0 down in Southbound
Jan 21 19:20:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:20:53Z|00591|binding|INFO|Removing iface tap93aa6e48-d8 ovn-installed in OVS
Jan 21 19:20:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:53.135 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:48:67 10.100.0.5'], port_security=['fa:16:3e:d8:48:67 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6b668707-d685-4bda-bfbf-c52a9214fc5a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09aa8d20-eb46-4367-945b-494fddadbef9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c95cb105-44d9-4f90-ae0f-7a483ddfaf48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=def33975-878d-4f5c-8b1e-729e778f0cd5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=93aa6e48-d8e4-4f3c-b816-eedca06529c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:20:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:53.137 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 in datapath 09aa8d20-eb46-4367-945b-494fddadbef9 unbound from our chassis#033[00m
Jan 21 19:20:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:53.139 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09aa8d20-eb46-4367-945b-494fddadbef9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:20:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:53.140 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b2f896-3218-42f9-ae35-fc13507d4d39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:53.140 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9 namespace which is not needed anymore#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.143 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000094.scope: Deactivated successfully.
Jan 21 19:20:53 np0005591285 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000094.scope: Consumed 15.975s CPU time.
Jan 21 19:20:53 np0005591285 systemd-machined[154022]: Machine qemu-68-instance-00000094 terminated.
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.316 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.342 182759 DEBUG nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.342 182759 DEBUG nova.compute.manager [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing instance network info cache due to event network-changed-93aa6e48-d8e4-4f3c-b816-eedca06529c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.342 182759 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.359 182759 INFO nova.virt.libvirt.driver [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Instance destroyed successfully.#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.360 182759 DEBUG nova.objects.instance [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 6b668707-d685-4bda-bfbf-c52a9214fc5a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.380 182759 DEBUG nova.virt.libvirt.vif [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:19:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-398620410',display_name='tempest-TestNetworkBasicOps-server-398620410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-398620410',id=148,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKIEvieg9TrMIcpLAYt3gk5p/YmFG00eTJcN+irKlcwFP4JIb9ny8lLu9l+wAcWyWHvM0k7OczTH+oTvUvPkfix7KDnMApoByZrmipa2gWnVbEhuqkyx+mSn5bJs2Pn/xw==',key_name='tempest-TestNetworkBasicOps-596721142',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-g3rthjto',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:19:51Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=6b668707-d685-4bda-bfbf-c52a9214fc5a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.381 182759 DEBUG nova.network.os_vif_util [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.381 182759 DEBUG nova.network.os_vif_util [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.382 182759 DEBUG os_vif [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.384 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.384 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93aa6e48-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.387 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.389 182759 INFO os_vif [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:48:67,bridge_name='br-int',has_traffic_filtering=True,id=93aa6e48-d8e4-4f3c-b816-eedca06529c0,network=Network(09aa8d20-eb46-4367-945b-494fddadbef9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93aa6e48-d8')#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.390 182759 INFO nova.virt.libvirt.driver [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Deleting instance files /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a_del#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.390 182759 INFO nova.virt.libvirt.driver [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Deletion of /var/lib/nova/instances/6b668707-d685-4bda-bfbf-c52a9214fc5a_del complete#033[00m
Jan 21 19:20:53 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [NOTICE]   (235220) : haproxy version is 2.8.14-c23fe91
Jan 21 19:20:53 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [NOTICE]   (235220) : path to executable is /usr/sbin/haproxy
Jan 21 19:20:53 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [WARNING]  (235220) : Exiting Master process...
Jan 21 19:20:53 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [WARNING]  (235220) : Exiting Master process...
Jan 21 19:20:53 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [ALERT]    (235220) : Current worker (235222) exited with code 143 (Terminated)
Jan 21 19:20:53 np0005591285 neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9[235216]: [WARNING]  (235220) : All workers exited. Exiting... (0)
Jan 21 19:20:53 np0005591285 systemd[1]: libpod-977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d.scope: Deactivated successfully.
Jan 21 19:20:53 np0005591285 podman[235646]: 2026-01-22 00:20:53.477006199 +0000 UTC m=+0.233889937 container died 977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:20:53 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d-userdata-shm.mount: Deactivated successfully.
Jan 21 19:20:53 np0005591285 systemd[1]: var-lib-containers-storage-overlay-fbd2bde88504642330d514732a207817e9a2afbd5c232906e48ad540e6790c45-merged.mount: Deactivated successfully.
Jan 21 19:20:53 np0005591285 podman[235646]: 2026-01-22 00:20:53.515485985 +0000 UTC m=+0.272369693 container cleanup 977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:20:53 np0005591285 systemd[1]: libpod-conmon-977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d.scope: Deactivated successfully.
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.531 182759 INFO nova.compute.manager [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.532 182759 DEBUG oslo.service.loopingcall [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.532 182759 DEBUG nova.compute.manager [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:20:53 np0005591285 nova_compute[182755]: 2026-01-22 00:20:53.532 182759 DEBUG nova.network.neutron [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:20:54 np0005591285 podman[235691]: 2026-01-22 00:20:54.340002997 +0000 UTC m=+0.796653980 container remove 977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.348 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[edccd840-d527-43d3-9ccd-b206dd6d794d]: (4, ('Thu Jan 22 12:20:53 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9 (977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d)\n977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d\nThu Jan 22 12:20:53 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9 (977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d)\n977f040c044f7163a091a0e4b4b2a261387bf1e0549304866ce8d00a81089e2d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.350 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c0dd35af-5000-4f33-8a47-381d470f5553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.351 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09aa8d20-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.352 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:54 np0005591285 kernel: tap09aa8d20-e0: left promiscuous mode
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.366 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.368 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c81db682-af52-455e-a942-c6f1172560da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.388 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7b250a22-7157-4739-83f1-421503b65fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.389 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b5d38d-8f16-43e5-bdb0-026b3d90308b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.409 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff71ddf-43d4-486f-ad62-f4b86d1a713c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574933, 'reachable_time': 22533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235707, 'error': None, 'target': 'ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.412 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09aa8d20-eb46-4367-945b-494fddadbef9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:20:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:20:54.412 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[72602cd5-83bb-4114-b947-f92fa8e2fb57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:20:54 np0005591285 systemd[1]: run-netns-ovnmeta\x2d09aa8d20\x2deb46\x2d4367\x2d945b\x2d494fddadbef9.mount: Deactivated successfully.
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.491 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [{"id": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "address": "fa:16:3e:d8:48:67", "network": {"id": "09aa8d20-eb46-4367-945b-494fddadbef9", "bridge": "br-int", "label": "tempest-network-smoke--676031905", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93aa6e48-d8", "ovs_interfaceid": "93aa6e48-d8e4-4f3c-b816-eedca06529c0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.729 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.729 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.729 182759 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.730 182759 DEBUG nova.network.neutron [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Refreshing network info cache for port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.731 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.881 182759 DEBUG nova.network.neutron [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.908 182759 INFO nova.compute.manager [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Took 1.38 seconds to deallocate network for instance.#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.987 182759 INFO nova.network.neutron [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Port 93aa6e48-d8e4-4f3c-b816-eedca06529c0 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 21 19:20:54 np0005591285 nova_compute[182755]: 2026-01-22 00:20:54.988 182759 DEBUG nova.network.neutron [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.014 182759 DEBUG oslo_concurrency.lockutils [req-73a89e32-16bb-4d25-a990-ec46b6588a1a req-5a022a22-5598-484c-9998-a6eba98daea1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6b668707-d685-4bda-bfbf-c52a9214fc5a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.027 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.027 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.131 182759 DEBUG nova.compute.provider_tree [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.159 182759 DEBUG nova.scheduler.client.report [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.189 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.223 182759 INFO nova.scheduler.client.report [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 6b668707-d685-4bda-bfbf-c52a9214fc5a#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.334 182759 DEBUG oslo_concurrency.lockutils [None req-343bfc38-043c-44f6-abfc-aa5a94af43a8 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.468 182759 DEBUG nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-unplugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.469 182759 DEBUG oslo_concurrency.lockutils [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.469 182759 DEBUG oslo_concurrency.lockutils [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.469 182759 DEBUG oslo_concurrency.lockutils [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.470 182759 DEBUG nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-unplugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.470 182759 WARNING nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-unplugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.470 182759 DEBUG nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.470 182759 DEBUG oslo_concurrency.lockutils [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.471 182759 DEBUG oslo_concurrency.lockutils [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.471 182759 DEBUG oslo_concurrency.lockutils [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6b668707-d685-4bda-bfbf-c52a9214fc5a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.471 182759 DEBUG nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] No waiting events found dispatching network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.472 182759 WARNING nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received unexpected event network-vif-plugged-93aa6e48-d8e4-4f3c-b816-eedca06529c0 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:20:55 np0005591285 nova_compute[182755]: 2026-01-22 00:20:55.472 182759 DEBUG nova.compute.manager [req-2226303c-e08e-4c8a-8dca-f02faf666e60 req-33645ba4-77b5-43fd-acfa-8b1e945a6b1c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Received event network-vif-deleted-93aa6e48-d8e4-4f3c-b816-eedca06529c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.395 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.396 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5739MB free_disk=73.19316864013672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.396 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.396 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.488 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.488 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.517 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.537 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.583 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:20:56 np0005591285 nova_compute[182755]: 2026-01-22 00:20:56.583 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:20:57 np0005591285 nova_compute[182755]: 2026-01-22 00:20:57.916 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:58 np0005591285 nova_compute[182755]: 2026-01-22 00:20:58.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:59 np0005591285 nova_compute[182755]: 2026-01-22 00:20:59.393 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:20:59 np0005591285 nova_compute[182755]: 2026-01-22 00:20:59.576 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:00 np0005591285 nova_compute[182755]: 2026-01-22 00:21:00.585 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:02 np0005591285 nova_compute[182755]: 2026-01-22 00:21:02.919 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:02 np0005591285 nova_compute[182755]: 2026-01-22 00:21:02.930 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041247.9292448, 04c5bee9-8745-45b2-884d-2abfbbec5d0e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:21:02 np0005591285 nova_compute[182755]: 2026-01-22 00:21:02.930 182759 INFO nova.compute.manager [-] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:21:02 np0005591285 nova_compute[182755]: 2026-01-22 00:21:02.948 182759 DEBUG nova.compute.manager [None req-7f0b1679-63b1-466c-95d2-5cdf5a9b49c4 - - - - - -] [instance: 04c5bee9-8745-45b2-884d-2abfbbec5d0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:21:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:02.985 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:02.985 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:02.985 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:03 np0005591285 nova_compute[182755]: 2026-01-22 00:21:03.389 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:06 np0005591285 podman[235713]: 2026-01-22 00:21:06.181747354 +0000 UTC m=+0.052198013 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 21 19:21:06 np0005591285 podman[235712]: 2026-01-22 00:21:06.181734143 +0000 UTC m=+0.054358540 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 21 19:21:07 np0005591285 nova_compute[182755]: 2026-01-22 00:21:07.921 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:08 np0005591285 nova_compute[182755]: 2026-01-22 00:21:08.357 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041253.357019, 6b668707-d685-4bda-bfbf-c52a9214fc5a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:21:08 np0005591285 nova_compute[182755]: 2026-01-22 00:21:08.358 182759 INFO nova.compute.manager [-] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] VM Stopped (Lifecycle Event)
Jan 21 19:21:08 np0005591285 nova_compute[182755]: 2026-01-22 00:21:08.389 182759 DEBUG nova.compute.manager [None req-e806cc0d-6b82-430b-97d8-d976830633d9 - - - - - -] [instance: 6b668707-d685-4bda-bfbf-c52a9214fc5a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:21:08 np0005591285 nova_compute[182755]: 2026-01-22 00:21:08.391 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:12 np0005591285 nova_compute[182755]: 2026-01-22 00:21:12.923 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:13 np0005591285 nova_compute[182755]: 2026-01-22 00:21:13.391 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:16 np0005591285 podman[235753]: 2026-01-22 00:21:16.178719066 +0000 UTC m=+0.053913461 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:21:17 np0005591285 nova_compute[182755]: 2026-01-22 00:21:17.925 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:18 np0005591285 podman[235778]: 2026-01-22 00:21:18.191704685 +0000 UTC m=+0.069937072 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 19:21:18 np0005591285 podman[235779]: 2026-01-22 00:21:18.191564451 +0000 UTC m=+0.054580689 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.393 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.680 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.680 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.697 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.807 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.808 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.814 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.814 182759 INFO nova.compute.claims [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Claim successful on node compute-2.ctlplane.example.com
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.956 182759 DEBUG nova.compute.provider_tree [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.974 182759 DEBUG nova.scheduler.client.report [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.995 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:21:18 np0005591285 nova_compute[182755]: 2026-01-22 00:21:18.996 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.054 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.055 182759 DEBUG nova.network.neutron [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.075 182759 INFO nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.099 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.234 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.235 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.235 182759 INFO nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Creating image(s)
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.236 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.236 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.237 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.249 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.301 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.302 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.303 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.316 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.366 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.367 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.411 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.413 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.414 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.471 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.473 182759 DEBUG nova.virt.disk.api [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.474 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.562 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.564 182759 DEBUG nova.virt.disk.api [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.565 182759 DEBUG nova.objects.instance [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 25b17338-0c55-4631-9d7e-896e6fa6339a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.586 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.587 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Ensure instance console log exists: /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.588 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.588 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.589 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:21:19 np0005591285 nova_compute[182755]: 2026-01-22 00:21:19.951 182759 DEBUG nova.policy [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 19:21:21 np0005591285 nova_compute[182755]: 2026-01-22 00:21:21.483 182759 DEBUG nova.network.neutron [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Successfully created port: 1d955a5e-1284-42e2-b7cc-b421102c744d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 19:21:22 np0005591285 podman[235835]: 2026-01-22 00:21:22.230318002 +0000 UTC m=+0.103158545 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:21:22 np0005591285 nova_compute[182755]: 2026-01-22 00:21:22.929 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:21:22 np0005591285 nova_compute[182755]: 2026-01-22 00:21:22.977 182759 DEBUG nova.network.neutron [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Successfully updated port: 1d955a5e-1284-42e2-b7cc-b421102c744d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.003 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.004 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.004 182759 DEBUG nova.network.neutron [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.122 182759 DEBUG nova.compute.manager [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-changed-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.123 182759 DEBUG nova.compute.manager [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Refreshing instance network info cache due to event network-changed-1d955a5e-1284-42e2-b7cc-b421102c744d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.123 182759 DEBUG oslo_concurrency.lockutils [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.203 182759 DEBUG nova.network.neutron [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:21:23 np0005591285 nova_compute[182755]: 2026-01-22 00:21:23.394 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.386 182759 DEBUG nova.network.neutron [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updating instance_info_cache with network_info: [{"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.411 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.412 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Instance network_info: |[{"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.412 182759 DEBUG oslo_concurrency.lockutils [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.413 182759 DEBUG nova.network.neutron [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Refreshing network info cache for port 1d955a5e-1284-42e2-b7cc-b421102c744d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.417 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Start _get_guest_xml network_info=[{"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.425 182759 WARNING nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.441 182759 DEBUG nova.virt.libvirt.host [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.441 182759 DEBUG nova.virt.libvirt.host [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.447 182759 DEBUG nova.virt.libvirt.host [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.448 182759 DEBUG nova.virt.libvirt.host [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.449 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.449 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.450 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.450 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.450 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.451 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.451 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.451 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.451 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.452 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.452 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.452 182759 DEBUG nova.virt.hardware [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.457 182759 DEBUG nova.virt.libvirt.vif [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-477861272',display_name='tempest-TestNetworkBasicOps-server-477861272',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-477861272',id=153,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1B0QtYzlQT6wVB9N7C5p7quLULloUzL9snEqx01Oq0jh5qZo1YFPza37ma4X75ier+uy28EOQJmoSDKJqbNt0MpI9jP9AsOnfOju00xn+AfZ3nuB13y+9PpFvzC303Tw==',key_name='tempest-TestNetworkBasicOps-613570151',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-8n7cxs6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:21:19Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=25b17338-0c55-4631-9d7e-896e6fa6339a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.457 182759 DEBUG nova.network.os_vif_util [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.458 182759 DEBUG nova.network.os_vif_util [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.459 182759 DEBUG nova.objects.instance [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 25b17338-0c55-4631-9d7e-896e6fa6339a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.492 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <uuid>25b17338-0c55-4631-9d7e-896e6fa6339a</uuid>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <name>instance-00000099</name>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestNetworkBasicOps-server-477861272</nova:name>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:21:24</nova:creationTime>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        <nova:port uuid="1d955a5e-1284-42e2-b7cc-b421102c744d">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <entry name="serial">25b17338-0c55-4631-9d7e-896e6fa6339a</entry>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <entry name="uuid">25b17338-0c55-4631-9d7e-896e6fa6339a</entry>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.config"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:1e:6e:18"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <target dev="tap1d955a5e-12"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/console.log" append="off"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:21:24 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:21:24 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:21:24 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:21:24 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.493 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Preparing to wait for external event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.494 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.494 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.494 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.495 182759 DEBUG nova.virt.libvirt.vif [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-477861272',display_name='tempest-TestNetworkBasicOps-server-477861272',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-477861272',id=153,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1B0QtYzlQT6wVB9N7C5p7quLULloUzL9snEqx01Oq0jh5qZo1YFPza37ma4X75ier+uy28EOQJmoSDKJqbNt0MpI9jP9AsOnfOju00xn+AfZ3nuB13y+9PpFvzC303Tw==',key_name='tempest-TestNetworkBasicOps-613570151',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-8n7cxs6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:21:19Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=25b17338-0c55-4631-9d7e-896e6fa6339a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.495 182759 DEBUG nova.network.os_vif_util [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.496 182759 DEBUG nova.network.os_vif_util [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.496 182759 DEBUG os_vif [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.497 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.497 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.498 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.502 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.502 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d955a5e-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.503 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1d955a5e-12, col_values=(('external_ids', {'iface-id': '1d955a5e-1284-42e2-b7cc-b421102c744d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:6e:18', 'vm-uuid': '25b17338-0c55-4631-9d7e-896e6fa6339a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.504 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:24 np0005591285 NetworkManager[55017]: <info>  [1769041284.5061] manager: (tap1d955a5e-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.515 182759 INFO os_vif [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12')#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.601 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.602 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.602 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:1e:6e:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.603 182759 INFO nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Using config drive#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.932 182759 INFO nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Creating config drive at /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.config#033[00m
Jan 21 19:21:24 np0005591285 nova_compute[182755]: 2026-01-22 00:21:24.937 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp426y7111 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.060 182759 DEBUG oslo_concurrency.processutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp426y7111" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:21:25 np0005591285 kernel: tap1d955a5e-12: entered promiscuous mode
Jan 21 19:21:25 np0005591285 NetworkManager[55017]: <info>  [1769041285.1199] manager: (tap1d955a5e-12): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.120 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:25Z|00592|binding|INFO|Claiming lport 1d955a5e-1284-42e2-b7cc-b421102c744d for this chassis.
Jan 21 19:21:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:25Z|00593|binding|INFO|1d955a5e-1284-42e2-b7cc-b421102c744d: Claiming fa:16:3e:1e:6e:18 10.100.0.13
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.124 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.136 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:6e:18 10.100.0.13'], port_security=['fa:16:3e:1e:6e:18 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '25b17338-0c55-4631-9d7e-896e6fa6339a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5ca70599-9d29-4e0e-b82d-7c3081b15cf6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51b97c50-36e0-44e4-96b5-94bb0eb67411, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=1d955a5e-1284-42e2-b7cc-b421102c744d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.137 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 1d955a5e-1284-42e2-b7cc-b421102c744d in datapath d8a7afef-267d-4702-a8ff-b40d78fc979d bound to our chassis#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.138 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8a7afef-267d-4702-a8ff-b40d78fc979d#033[00m
Jan 21 19:21:25 np0005591285 systemd-udevd[235881]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.151 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b8945fb0-eada-4448-a705-ecb2c91f87de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.152 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8a7afef-21 in ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.154 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8a7afef-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.154 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4b67ae7a-7f54-4d7b-a6c3-e46e51075edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.155 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bd56f9ec-77fc-48f2-ac99-f771fb3a88af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 NetworkManager[55017]: <info>  [1769041285.1590] device (tap1d955a5e-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:21:25 np0005591285 NetworkManager[55017]: <info>  [1769041285.1600] device (tap1d955a5e-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:21:25 np0005591285 systemd-machined[154022]: New machine qemu-70-instance-00000099.
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.165 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc4ed66-3e6c-475f-9a4b-0c1ea94c3464]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.177 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.180 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[46732c9d-6fb0-40e5-bfa6-799aedc03c2c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:25Z|00594|binding|INFO|Setting lport 1d955a5e-1284-42e2-b7cc-b421102c744d ovn-installed in OVS
Jan 21 19:21:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:25Z|00595|binding|INFO|Setting lport 1d955a5e-1284-42e2-b7cc-b421102c744d up in Southbound
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.184 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 systemd[1]: Started Virtual Machine qemu-70-instance-00000099.
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.205 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[96b39d07-29ab-43b6-b248-de1aab1f5c7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.211 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[47078c9f-c04e-4d81-b548-76b930b50975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 NetworkManager[55017]: <info>  [1769041285.2124] manager: (tapd8a7afef-20): new Veth device (/org/freedesktop/NetworkManager/Devices/286)
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.242 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ee725e2a-952d-4d4f-82b8-64399467cc43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.245 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[16ff2742-01a1-41c4-b140-5e24ba90fd0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 NetworkManager[55017]: <info>  [1769041285.2671] device (tapd8a7afef-20): carrier: link connected
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.271 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d1504a58-32f3-4690-b9f5-6b5cfebbfa84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.287 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[aef6b497-f094-42ec-a620-928afecc0a07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a7afef-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:7b:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584392, 'reachable_time': 26701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235914, 'error': None, 'target': 'ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.304 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb240ac-d66e-43a6-809f-6e105d133dac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:7b76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 584392, 'tstamp': 584392}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235915, 'error': None, 'target': 'ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.323 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[97b899a2-5ca1-4d65-a680-c586833bed70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a7afef-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:7b:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584392, 'reachable_time': 26701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235916, 'error': None, 'target': 'ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.349 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc2d504-d5a8-4978-be08-1c3d51d75931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.392 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cc43b122-b741-4c91-ab8b-444db90776ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.394 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a7afef-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.394 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.394 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8a7afef-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.396 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 kernel: tapd8a7afef-20: entered promiscuous mode
Jan 21 19:21:25 np0005591285 NetworkManager[55017]: <info>  [1769041285.3966] manager: (tapd8a7afef-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.398 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8a7afef-20, col_values=(('external_ids', {'iface-id': 'fec7c3ef-ea69-4780-bb3e-9fb8476238d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.399 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.401 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.401 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8a7afef-267d-4702-a8ff-b40d78fc979d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8a7afef-267d-4702-a8ff-b40d78fc979d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:21:25 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:25Z|00596|binding|INFO|Releasing lport fec7c3ef-ea69-4780-bb3e-9fb8476238d8 from this chassis (sb_readonly=0)
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.402 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3282a3da-f118-4a6f-a185-dd4ec53ed20e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.403 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-d8a7afef-267d-4702-a8ff-b40d78fc979d
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/d8a7afef-267d-4702-a8ff-b40d78fc979d.pid.haproxy
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID d8a7afef-267d-4702-a8ff-b40d78fc979d
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:21:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:25.403 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'env', 'PROCESS_TAG=haproxy-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8a7afef-267d-4702-a8ff-b40d78fc979d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.471 182759 DEBUG nova.compute.manager [req-45fc0e76-6fd5-4685-a6f8-ceb86cf92a9a req-2bea0f94-6c2f-425e-9dd5-8e4fcb5c6da1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.472 182759 DEBUG oslo_concurrency.lockutils [req-45fc0e76-6fd5-4685-a6f8-ceb86cf92a9a req-2bea0f94-6c2f-425e-9dd5-8e4fcb5c6da1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.472 182759 DEBUG oslo_concurrency.lockutils [req-45fc0e76-6fd5-4685-a6f8-ceb86cf92a9a req-2bea0f94-6c2f-425e-9dd5-8e4fcb5c6da1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.472 182759 DEBUG oslo_concurrency.lockutils [req-45fc0e76-6fd5-4685-a6f8-ceb86cf92a9a req-2bea0f94-6c2f-425e-9dd5-8e4fcb5c6da1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:25 np0005591285 nova_compute[182755]: 2026-01-22 00:21:25.472 182759 DEBUG nova.compute.manager [req-45fc0e76-6fd5-4685-a6f8-ceb86cf92a9a req-2bea0f94-6c2f-425e-9dd5-8e4fcb5c6da1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Processing event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:21:25 np0005591285 podman[235945]: 2026-01-22 00:21:25.727764108 +0000 UTC m=+0.020771300 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.029 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.030 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041286.0288568, 25b17338-0c55-4631-9d7e-896e6fa6339a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.030 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] VM Started (Lifecycle Event)#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.033 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.036 182759 INFO nova.virt.libvirt.driver [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Instance spawned successfully.#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.036 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.067 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.073 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.073 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.073 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.074 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.074 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.075 182759 DEBUG nova.virt.libvirt.driver [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.080 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.137 182759 DEBUG nova.network.neutron [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updated VIF entry in instance network info cache for port 1d955a5e-1284-42e2-b7cc-b421102c744d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.138 182759 DEBUG nova.network.neutron [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updating instance_info_cache with network_info: [{"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.147 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.147 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041286.0297697, 25b17338-0c55-4631-9d7e-896e6fa6339a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.148 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:21:26 np0005591285 podman[235945]: 2026-01-22 00:21:26.15514507 +0000 UTC m=+0.448152242 container create 13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.188 182759 DEBUG oslo_concurrency.lockutils [req-ab4a751e-31bb-4e25-890c-4326934c8a99 req-87acc936-9b9f-46cc-94e5-1950d95d464f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.192 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.196 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041286.0324457, 25b17338-0c55-4631-9d7e-896e6fa6339a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.197 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:21:26 np0005591285 systemd[1]: Started libpod-conmon-13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060.scope.
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.224 182759 INFO nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Took 6.99 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.224 182759 DEBUG nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.226 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:21:26 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.232 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:21:26 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d6bd3d1db287e6aa3cc0078621ba4c5a0d6094d3fe4edaa2803a40f90327f2e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:21:26 np0005591285 podman[235945]: 2026-01-22 00:21:26.249407395 +0000 UTC m=+0.542414567 container init 13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:21:26 np0005591285 podman[235945]: 2026-01-22 00:21:26.256343262 +0000 UTC m=+0.549350434 container start 13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:21:26 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [NOTICE]   (235971) : New worker (235973) forked
Jan 21 19:21:26 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [NOTICE]   (235971) : Loading success.
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.289 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.330 182759 INFO nova.compute.manager [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Took 7.57 seconds to build instance.#033[00m
Jan 21 19:21:26 np0005591285 nova_compute[182755]: 2026-01-22 00:21:26.352 182759 DEBUG oslo_concurrency.lockutils [None req-1e6fd58f-821a-4085-b2e9-5cf75db49cd1 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.627 182759 DEBUG nova.compute.manager [req-02ea1056-d816-4503-8dba-bc8e64e25fda req-92ba1de7-c9d3-415b-aaf8-06ea71fe2ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.628 182759 DEBUG oslo_concurrency.lockutils [req-02ea1056-d816-4503-8dba-bc8e64e25fda req-92ba1de7-c9d3-415b-aaf8-06ea71fe2ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.629 182759 DEBUG oslo_concurrency.lockutils [req-02ea1056-d816-4503-8dba-bc8e64e25fda req-92ba1de7-c9d3-415b-aaf8-06ea71fe2ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.629 182759 DEBUG oslo_concurrency.lockutils [req-02ea1056-d816-4503-8dba-bc8e64e25fda req-92ba1de7-c9d3-415b-aaf8-06ea71fe2ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.629 182759 DEBUG nova.compute.manager [req-02ea1056-d816-4503-8dba-bc8e64e25fda req-92ba1de7-c9d3-415b-aaf8-06ea71fe2ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] No waiting events found dispatching network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.629 182759 WARNING nova.compute.manager [req-02ea1056-d816-4503-8dba-bc8e64e25fda req-92ba1de7-c9d3-415b-aaf8-06ea71fe2ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received unexpected event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d for instance with vm_state active and task_state None.#033[00m
Jan 21 19:21:27 np0005591285 nova_compute[182755]: 2026-01-22 00:21:27.930 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:29 np0005591285 nova_compute[182755]: 2026-01-22 00:21:29.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:31 np0005591285 NetworkManager[55017]: <info>  [1769041291.2031] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 21 19:21:31 np0005591285 NetworkManager[55017]: <info>  [1769041291.2044] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.205 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.278 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:31 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:31Z|00597|binding|INFO|Releasing lport fec7c3ef-ea69-4780-bb3e-9fb8476238d8 from this chassis (sb_readonly=0)
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.292 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.588 182759 DEBUG nova.compute.manager [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-changed-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.589 182759 DEBUG nova.compute.manager [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Refreshing instance network info cache due to event network-changed-1d955a5e-1284-42e2-b7cc-b421102c744d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.589 182759 DEBUG oslo_concurrency.lockutils [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.590 182759 DEBUG oslo_concurrency.lockutils [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:21:31 np0005591285 nova_compute[182755]: 2026-01-22 00:21:31.590 182759 DEBUG nova.network.neutron [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Refreshing network info cache for port 1d955a5e-1284-42e2-b7cc-b421102c744d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:21:32 np0005591285 nova_compute[182755]: 2026-01-22 00:21:32.932 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:34 np0005591285 nova_compute[182755]: 2026-01-22 00:21:34.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:34 np0005591285 nova_compute[182755]: 2026-01-22 00:21:34.584 182759 DEBUG nova.network.neutron [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updated VIF entry in instance network info cache for port 1d955a5e-1284-42e2-b7cc-b421102c744d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:21:34 np0005591285 nova_compute[182755]: 2026-01-22 00:21:34.584 182759 DEBUG nova.network.neutron [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updating instance_info_cache with network_info: [{"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:21:34 np0005591285 nova_compute[182755]: 2026-01-22 00:21:34.611 182759 DEBUG oslo_concurrency.lockutils [req-4c142803-8352-4bc8-9301-4c64ee882af6 req-11bb1bd3-51a8-47cd-b57c-bff9ba5c7170 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:21:35 np0005591285 nova_compute[182755]: 2026-01-22 00:21:35.658 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:37 np0005591285 podman[235984]: 2026-01-22 00:21:37.203107809 +0000 UTC m=+0.055214566 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 21 19:21:37 np0005591285 podman[235983]: 2026-01-22 00:21:37.209942264 +0000 UTC m=+0.062486942 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git)
Jan 21 19:21:37 np0005591285 nova_compute[182755]: 2026-01-22 00:21:37.935 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:39 np0005591285 nova_compute[182755]: 2026-01-22 00:21:39.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:39 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:39Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:6e:18 10.100.0.13
Jan 21 19:21:39 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:39Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:6e:18 10.100.0.13
Jan 21 19:21:42 np0005591285 nova_compute[182755]: 2026-01-22 00:21:42.936 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:44 np0005591285 nova_compute[182755]: 2026-01-22 00:21:44.516 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:46 np0005591285 nova_compute[182755]: 2026-01-22 00:21:46.212 182759 INFO nova.compute.manager [None req-4ef0cce4-5b1f-40ac-b91d-2205bd0c31ba 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Get console output#033[00m
Jan 21 19:21:46 np0005591285 nova_compute[182755]: 2026-01-22 00:21:46.218 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:21:47 np0005591285 podman[236040]: 2026-01-22 00:21:47.181987521 +0000 UTC m=+0.053430017 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:21:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:47Z|00598|binding|INFO|Releasing lport fec7c3ef-ea69-4780-bb3e-9fb8476238d8 from this chassis (sb_readonly=0)
Jan 21 19:21:47 np0005591285 nova_compute[182755]: 2026-01-22 00:21:47.938 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:48Z|00599|binding|INFO|Releasing lport fec7c3ef-ea69-4780-bb3e-9fb8476238d8 from this chassis (sb_readonly=0)
Jan 21 19:21:48 np0005591285 nova_compute[182755]: 2026-01-22 00:21:48.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:49 np0005591285 podman[236067]: 2026-01-22 00:21:49.193104301 +0000 UTC m=+0.061980177 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:21:49 np0005591285 podman[236068]: 2026-01-22 00:21:49.197629213 +0000 UTC m=+0.063465408 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:21:49 np0005591285 nova_compute[182755]: 2026-01-22 00:21:49.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:49 np0005591285 nova_compute[182755]: 2026-01-22 00:21:49.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:21:49 np0005591285 nova_compute[182755]: 2026-01-22 00:21:49.341 182759 INFO nova.compute.manager [None req-162b7724-90cc-49ca-aa20-170357f5b305 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Get console output#033[00m
Jan 21 19:21:49 np0005591285 nova_compute[182755]: 2026-01-22 00:21:49.348 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:21:49 np0005591285 nova_compute[182755]: 2026-01-22 00:21:49.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:50 np0005591285 nova_compute[182755]: 2026-01-22 00:21:50.111 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:50.111 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:21:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:50.112 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:21:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:50.113 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:50 np0005591285 nova_compute[182755]: 2026-01-22 00:21:50.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:50 np0005591285 NetworkManager[55017]: <info>  [1769041310.4966] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Jan 21 19:21:50 np0005591285 nova_compute[182755]: 2026-01-22 00:21:50.496 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:50 np0005591285 NetworkManager[55017]: <info>  [1769041310.4973] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 21 19:21:50 np0005591285 nova_compute[182755]: 2026-01-22 00:21:50.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:50Z|00600|binding|INFO|Releasing lport fec7c3ef-ea69-4780-bb3e-9fb8476238d8 from this chassis (sb_readonly=0)
Jan 21 19:21:50 np0005591285 nova_compute[182755]: 2026-01-22 00:21:50.587 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:51 np0005591285 nova_compute[182755]: 2026-01-22 00:21:51.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.043 182759 INFO nova.compute.manager [None req-a17de048-eb93-4661-ae56-3c65bcf2e017 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Get console output#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.048 211512 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.605 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.605 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.605 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:21:52 np0005591285 nova_compute[182755]: 2026-01-22 00:21:52.606 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 25b17338-0c55-4631-9d7e-896e6fa6339a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:21:53 np0005591285 nova_compute[182755]: 2026-01-22 00:21:53.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:53 np0005591285 podman[236111]: 2026-01-22 00:21:53.23030584 +0000 UTC m=+0.100640937 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:21:54 np0005591285 nova_compute[182755]: 2026-01-22 00:21:54.213 182759 DEBUG nova.compute.manager [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-changed-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:54 np0005591285 nova_compute[182755]: 2026-01-22 00:21:54.213 182759 DEBUG nova.compute.manager [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Refreshing instance network info cache due to event network-changed-1d955a5e-1284-42e2-b7cc-b421102c744d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:21:54 np0005591285 nova_compute[182755]: 2026-01-22 00:21:54.213 182759 DEBUG oslo_concurrency.lockutils [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:21:54 np0005591285 nova_compute[182755]: 2026-01-22 00:21:54.521 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.007 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.008 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.008 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.009 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.010 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.103 182759 INFO nova.compute.manager [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Terminating instance#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.122 182759 DEBUG nova.compute.manager [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:21:55 np0005591285 kernel: tap1d955a5e-12 (unregistering): left promiscuous mode
Jan 21 19:21:55 np0005591285 NetworkManager[55017]: <info>  [1769041315.1450] device (tap1d955a5e-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:21:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:55Z|00601|binding|INFO|Releasing lport 1d955a5e-1284-42e2-b7cc-b421102c744d from this chassis (sb_readonly=0)
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.211 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:55Z|00602|binding|INFO|Setting lport 1d955a5e-1284-42e2-b7cc-b421102c744d down in Southbound
Jan 21 19:21:55 np0005591285 ovn_controller[94908]: 2026-01-22T00:21:55Z|00603|binding|INFO|Removing iface tap1d955a5e-12 ovn-installed in OVS
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.213 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.224 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.242 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:6e:18 10.100.0.13'], port_security=['fa:16:3e:1e:6e:18 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '25b17338-0c55-4631-9d7e-896e6fa6339a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5ca70599-9d29-4e0e-b82d-7c3081b15cf6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51b97c50-36e0-44e4-96b5-94bb0eb67411, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=1d955a5e-1284-42e2-b7cc-b421102c744d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.244 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 1d955a5e-1284-42e2-b7cc-b421102c744d in datapath d8a7afef-267d-4702-a8ff-b40d78fc979d unbound from our chassis#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.245 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8a7afef-267d-4702-a8ff-b40d78fc979d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.246 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8c174684-9b24-475d-ab0d-abe8fd3b2021]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.247 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d namespace which is not needed anymore#033[00m
Jan 21 19:21:55 np0005591285 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000099.scope: Deactivated successfully.
Jan 21 19:21:55 np0005591285 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000099.scope: Consumed 13.900s CPU time.
Jan 21 19:21:55 np0005591285 systemd-machined[154022]: Machine qemu-70-instance-00000099 terminated.
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.347 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.353 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.384 182759 INFO nova.virt.libvirt.driver [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Instance destroyed successfully.#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.384 182759 DEBUG nova.objects.instance [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 25b17338-0c55-4631-9d7e-896e6fa6339a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:21:55 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [NOTICE]   (235971) : haproxy version is 2.8.14-c23fe91
Jan 21 19:21:55 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [NOTICE]   (235971) : path to executable is /usr/sbin/haproxy
Jan 21 19:21:55 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [WARNING]  (235971) : Exiting Master process...
Jan 21 19:21:55 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [ALERT]    (235971) : Current worker (235973) exited with code 143 (Terminated)
Jan 21 19:21:55 np0005591285 neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d[235967]: [WARNING]  (235971) : All workers exited. Exiting... (0)
Jan 21 19:21:55 np0005591285 systemd[1]: libpod-13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060.scope: Deactivated successfully.
Jan 21 19:21:55 np0005591285 podman[236161]: 2026-01-22 00:21:55.401302928 +0000 UTC m=+0.056887921 container died 13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.405 182759 DEBUG nova.virt.libvirt.vif [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-477861272',display_name='tempest-TestNetworkBasicOps-server-477861272',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-477861272',id=153,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1B0QtYzlQT6wVB9N7C5p7quLULloUzL9snEqx01Oq0jh5qZo1YFPza37ma4X75ier+uy28EOQJmoSDKJqbNt0MpI9jP9AsOnfOju00xn+AfZ3nuB13y+9PpFvzC303Tw==',key_name='tempest-TestNetworkBasicOps-613570151',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:21:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-8n7cxs6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:21:26Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=25b17338-0c55-4631-9d7e-896e6fa6339a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.405 182759 DEBUG nova.network.os_vif_util [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.406 182759 DEBUG nova.network.os_vif_util [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.407 182759 DEBUG os_vif [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.410 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d955a5e-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.412 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.416 182759 INFO os_vif [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:6e:18,bridge_name='br-int',has_traffic_filtering=True,id=1d955a5e-1284-42e2-b7cc-b421102c744d,network=Network(d8a7afef-267d-4702-a8ff-b40d78fc979d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d955a5e-12')#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.416 182759 INFO nova.virt.libvirt.driver [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Deleting instance files /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a_del#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.417 182759 INFO nova.virt.libvirt.driver [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Deletion of /var/lib/nova/instances/25b17338-0c55-4631-9d7e-896e6fa6339a_del complete#033[00m
Jan 21 19:21:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060-userdata-shm.mount: Deactivated successfully.
Jan 21 19:21:55 np0005591285 systemd[1]: var-lib-containers-storage-overlay-3d6bd3d1db287e6aa3cc0078621ba4c5a0d6094d3fe4edaa2803a40f90327f2e-merged.mount: Deactivated successfully.
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.521 182759 INFO nova.compute.manager [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.522 182759 DEBUG oslo.service.loopingcall [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.522 182759 DEBUG nova.compute.manager [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.522 182759 DEBUG nova.network.neutron [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.556 182759 DEBUG nova.compute.manager [req-a4b4e2b0-fdb4-4603-b29f-cb4a15184eba req-e05e3044-eccd-41de-9cc2-a6d098ad5d42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-vif-unplugged-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.557 182759 DEBUG oslo_concurrency.lockutils [req-a4b4e2b0-fdb4-4603-b29f-cb4a15184eba req-e05e3044-eccd-41de-9cc2-a6d098ad5d42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.557 182759 DEBUG oslo_concurrency.lockutils [req-a4b4e2b0-fdb4-4603-b29f-cb4a15184eba req-e05e3044-eccd-41de-9cc2-a6d098ad5d42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.557 182759 DEBUG oslo_concurrency.lockutils [req-a4b4e2b0-fdb4-4603-b29f-cb4a15184eba req-e05e3044-eccd-41de-9cc2-a6d098ad5d42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.557 182759 DEBUG nova.compute.manager [req-a4b4e2b0-fdb4-4603-b29f-cb4a15184eba req-e05e3044-eccd-41de-9cc2-a6d098ad5d42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] No waiting events found dispatching network-vif-unplugged-1d955a5e-1284-42e2-b7cc-b421102c744d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.558 182759 DEBUG nova.compute.manager [req-a4b4e2b0-fdb4-4603-b29f-cb4a15184eba req-e05e3044-eccd-41de-9cc2-a6d098ad5d42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-vif-unplugged-1d955a5e-1284-42e2-b7cc-b421102c744d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:21:55 np0005591285 podman[236161]: 2026-01-22 00:21:55.564657751 +0000 UTC m=+0.220242754 container cleanup 13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:21:55 np0005591285 systemd[1]: libpod-conmon-13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060.scope: Deactivated successfully.
Jan 21 19:21:55 np0005591285 podman[236208]: 2026-01-22 00:21:55.657292392 +0000 UTC m=+0.065080602 container remove 13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.662 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7073c6-e754-4f72-b339-f16f565e2b28]: (4, ('Thu Jan 22 12:21:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d (13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060)\n13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060\nThu Jan 22 12:21:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d (13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060)\n13dbf31a813b05cfb5caa99bd13e0ac4f3713aebf381bb8358d6fa15e1436060\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.664 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[40ca0c96-99a9-4052-b169-8935c4e925f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.665 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a7afef-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.666 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 kernel: tapd8a7afef-20: left promiscuous mode
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.679 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 nova_compute[182755]: 2026-01-22 00:21:55.680 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.682 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[28f81179-6f26-496f-bbe4-4ba66f0b1205]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.696 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[988dec3c-cbd3-4430-b4a2-0605d708bf51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.697 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6b1862-7b2d-467a-a0c8-5cabe0af9c70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.711 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[358b650d-3a75-4cdf-ab28-3fc953ff5339]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 584385, 'reachable_time': 16450, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236224, 'error': None, 'target': 'ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.715 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8a7afef-267d-4702-a8ff-b40d78fc979d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:21:55 np0005591285 systemd[1]: run-netns-ovnmeta\x2dd8a7afef\x2d267d\x2d4702\x2da8ff\x2db40d78fc979d.mount: Deactivated successfully.
Jan 21 19:21:55 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:21:55.715 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[d569597d-ae17-4deb-b2f3-8913792e650a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.348 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updating instance_info_cache with network_info: [{"id": "1d955a5e-1284-42e2-b7cc-b421102c744d", "address": "fa:16:3e:1e:6e:18", "network": {"id": "d8a7afef-267d-4702-a8ff-b40d78fc979d", "bridge": "br-int", "label": "tempest-network-smoke--1265373196", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d955a5e-12", "ovs_interfaceid": "1d955a5e-1284-42e2-b7cc-b421102c744d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.968 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.969 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.969 182759 DEBUG oslo_concurrency.lockutils [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.970 182759 DEBUG nova.network.neutron [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Refreshing network info cache for port 1d955a5e-1284-42e2-b7cc-b421102c744d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.971 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:56 np0005591285 nova_compute[182755]: 2026-01-22 00:21:56.972 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.188 182759 DEBUG nova.network.neutron [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.209 182759 INFO nova.compute.manager [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Took 1.69 seconds to deallocate network for instance.#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.248 182759 INFO nova.network.neutron [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Port 1d955a5e-1284-42e2-b7cc-b421102c744d from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.249 182759 DEBUG nova.network.neutron [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.557 182759 DEBUG oslo_concurrency.lockutils [req-533bb9a6-ca9c-4d54-b9e0-8335f87243cd req-6d6df891-9687-4686-bb2e-0da4242a1bd4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-25b17338-0c55-4631-9d7e-896e6fa6339a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.575 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.575 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.646 182759 DEBUG nova.compute.provider_tree [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.671 182759 DEBUG nova.scheduler.client.report [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.677 182759 DEBUG nova.compute.manager [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.677 182759 DEBUG oslo_concurrency.lockutils [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.677 182759 DEBUG oslo_concurrency.lockutils [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.678 182759 DEBUG oslo_concurrency.lockutils [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.678 182759 DEBUG nova.compute.manager [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] No waiting events found dispatching network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.678 182759 WARNING nova.compute.manager [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received unexpected event network-vif-plugged-1d955a5e-1284-42e2-b7cc-b421102c744d for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.678 182759 DEBUG nova.compute.manager [req-6fb6e847-0d82-4bad-bcff-cd95f0dde8e8 req-844ffc48-e69d-4d29-b6db-2a4cba607d39 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Received event network-vif-deleted-1d955a5e-1284-42e2-b7cc-b421102c744d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.701 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.733 182759 INFO nova.scheduler.client.report [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 25b17338-0c55-4631-9d7e-896e6fa6339a#033[00m
Jan 21 19:21:57 np0005591285 nova_compute[182755]: 2026-01-22 00:21:57.853 182759 DEBUG oslo_concurrency.lockutils [None req-c1778e65-ddc0-40fe-a1db-289e8d72a92f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "25b17338-0c55-4631-9d7e-896e6fa6339a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.065 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.281 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.282 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.282 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.283 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.507 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.509 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5745MB free_disk=73.19315719604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.509 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.509 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.732 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.733 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.755 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.772 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.795 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:21:58 np0005591285 nova_compute[182755]: 2026-01-22 00:21:58.795 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:00 np0005591285 nova_compute[182755]: 2026-01-22 00:22:00.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:00 np0005591285 nova_compute[182755]: 2026-01-22 00:22:00.795 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:01 np0005591285 nova_compute[182755]: 2026-01-22 00:22:01.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:02.985 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:02.986 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:02.986 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:03 np0005591285 nova_compute[182755]: 2026-01-22 00:22:03.066 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:05 np0005591285 nova_compute[182755]: 2026-01-22 00:22:05.429 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:05 np0005591285 nova_compute[182755]: 2026-01-22 00:22:05.538 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:08 np0005591285 nova_compute[182755]: 2026-01-22 00:22:08.068 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:08 np0005591285 podman[236227]: 2026-01-22 00:22:08.189522133 +0000 UTC m=+0.055538024 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, 
container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 21 19:22:08 np0005591285 podman[236228]: 2026-01-22 00:22:08.19868178 +0000 UTC m=+0.059269325 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:22:10 np0005591285 nova_compute[182755]: 2026-01-22 00:22:10.382 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041315.381271, 25b17338-0c55-4631-9d7e-896e6fa6339a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:22:10 np0005591285 nova_compute[182755]: 2026-01-22 00:22:10.383 182759 INFO nova.compute.manager [-] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:22:10 np0005591285 nova_compute[182755]: 2026-01-22 00:22:10.403 182759 DEBUG nova.compute.manager [None req-fc79636a-411b-4c66-86dc-7b1ab14404bc - - - - - -] [instance: 25b17338-0c55-4631-9d7e-896e6fa6339a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:22:10 np0005591285 nova_compute[182755]: 2026-01-22 00:22:10.433 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:13 np0005591285 nova_compute[182755]: 2026-01-22 00:22:13.070 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:15 np0005591285 nova_compute[182755]: 2026-01-22 00:22:15.437 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:18 np0005591285 nova_compute[182755]: 2026-01-22 00:22:18.072 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:18 np0005591285 podman[236269]: 2026-01-22 00:22:18.180805208 +0000 UTC m=+0.054781904 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:22:20 np0005591285 podman[236293]: 2026-01-22 00:22:20.180379127 +0000 UTC m=+0.052148804 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:22:20 np0005591285 podman[236292]: 2026-01-22 00:22:20.205581345 +0000 UTC m=+0.078632586 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 19:22:20 np0005591285 nova_compute[182755]: 2026-01-22 00:22:20.441 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:23 np0005591285 nova_compute[182755]: 2026-01-22 00:22:23.074 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.172 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:22:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:22:24 np0005591285 podman[236333]: 2026-01-22 00:22:24.272816251 +0000 UTC m=+0.132225496 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 19:22:25 np0005591285 nova_compute[182755]: 2026-01-22 00:22:25.444 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:28 np0005591285 nova_compute[182755]: 2026-01-22 00:22:28.077 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:30 np0005591285 nova_compute[182755]: 2026-01-22 00:22:30.446 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:33 np0005591285 nova_compute[182755]: 2026-01-22 00:22:33.078 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:35 np0005591285 nova_compute[182755]: 2026-01-22 00:22:35.450 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:36 np0005591285 nova_compute[182755]: 2026-01-22 00:22:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:36 np0005591285 nova_compute[182755]: 2026-01-22 00:22:36.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:22:36 np0005591285 nova_compute[182755]: 2026-01-22 00:22:36.245 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:22:38 np0005591285 nova_compute[182755]: 2026-01-22 00:22:38.080 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:39 np0005591285 podman[236362]: 2026-01-22 00:22:39.183703733 +0000 UTC m=+0.051264909 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 19:22:39 np0005591285 podman[236361]: 2026-01-22 00:22:39.184643448 +0000 UTC m=+0.055052821 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter)
Jan 21 19:22:40 np0005591285 nova_compute[182755]: 2026-01-22 00:22:40.453 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:43 np0005591285 nova_compute[182755]: 2026-01-22 00:22:43.096 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:45 np0005591285 nova_compute[182755]: 2026-01-22 00:22:45.456 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.014 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.014 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.032 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.101 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.147 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.147 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.153 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.154 182759 INFO nova.compute.claims [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.306 182759 DEBUG nova.compute.provider_tree [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.319 182759 DEBUG nova.scheduler.client.report [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.341 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.341 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.414 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.414 182759 DEBUG nova.network.neutron [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.449 182759 INFO nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.475 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.595 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.596 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.597 182759 INFO nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Creating image(s)#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.597 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.597 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.598 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.610 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.666 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.667 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.668 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.678 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.730 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.731 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.759 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.761 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.761 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.813 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.815 182759 DEBUG nova.virt.disk.api [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.815 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.869 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.870 182759 DEBUG nova.virt.disk.api [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.871 182759 DEBUG nova.objects.instance [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.887 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.887 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Ensure instance console log exists: /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.888 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.888 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:48 np0005591285 nova_compute[182755]: 2026-01-22 00:22:48.888 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:49 np0005591285 nova_compute[182755]: 2026-01-22 00:22:49.097 182759 DEBUG nova.policy [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:22:49 np0005591285 podman[236416]: 2026-01-22 00:22:49.166298295 +0000 UTC m=+0.042459374 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:22:50 np0005591285 nova_compute[182755]: 2026-01-22 00:22:50.229 182759 DEBUG nova.network.neutron [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Successfully created port: 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:22:50 np0005591285 nova_compute[182755]: 2026-01-22 00:22:50.240 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:50 np0005591285 nova_compute[182755]: 2026-01-22 00:22:50.459 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.089 182759 DEBUG nova.network.neutron [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Successfully updated port: 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.133 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.133 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.133 182759 DEBUG nova.network.neutron [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:22:51 np0005591285 podman[236440]: 2026-01-22 00:22:51.184856883 +0000 UTC m=+0.054596239 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 19:22:51 np0005591285 podman[236441]: 2026-01-22 00:22:51.200841333 +0000 UTC m=+0.063576871 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.212 182759 DEBUG nova.compute.manager [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-changed-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.212 182759 DEBUG nova.compute.manager [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Refreshing instance network info cache due to event network-changed-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.212 182759 DEBUG oslo_concurrency.lockutils [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.216 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:22:51 np0005591285 nova_compute[182755]: 2026-01-22 00:22:51.304 182759 DEBUG nova.network.neutron [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.458 182759 DEBUG nova.network.neutron [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updating instance_info_cache with network_info: [{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.490 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.491 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Instance network_info: |[{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.491 182759 DEBUG oslo_concurrency.lockutils [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.491 182759 DEBUG nova.network.neutron [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Refreshing network info cache for port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.494 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Start _get_guest_xml network_info=[{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.500 182759 WARNING nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.505 182759 DEBUG nova.virt.libvirt.host [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.505 182759 DEBUG nova.virt.libvirt.host [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.509 182759 DEBUG nova.virt.libvirt.host [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.510 182759 DEBUG nova.virt.libvirt.host [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.511 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.511 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.511 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.512 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.512 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.512 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.512 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.512 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.513 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.513 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.513 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.513 182759 DEBUG nova.virt.hardware [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.517 182759 DEBUG nova.virt.libvirt.vif [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=157,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzNbeWQqM+R5mCLYdVsdPyQXYDqnkkhhC73mxN5fX0QRC+i5pxaSAc7LRsQKs9V1np8BzitSAx9O4U37xdH3m6MF7eYp2Ff07iBZVcoSIsB4CpGyP/xz08PAIvxm/KFgA==',key_name='tempest-TestSecurityGroupsBasicOps-135115301',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-3dy1zpts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:22:48Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=92f9af4e-c724-4454-bf9f-49ae4bdb49a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.517 182759 DEBUG nova.network.os_vif_util [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.518 182759 DEBUG nova.network.os_vif_util [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.519 182759 DEBUG nova.objects.instance [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.544 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <uuid>92f9af4e-c724-4454-bf9f-49ae4bdb49a3</uuid>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <name>instance-0000009d</name>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747</nova:name>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:22:52</nova:creationTime>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        <nova:port uuid="57b7bcc8-a7af-4f4d-b001-4f3b96c7d691">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <entry name="serial">92f9af4e-c724-4454-bf9f-49ae4bdb49a3</entry>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <entry name="uuid">92f9af4e-c724-4454-bf9f-49ae4bdb49a3</entry>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.config"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:0b:1a:79"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <target dev="tap57b7bcc8-a7"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/console.log" append="off"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:22:52 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:22:52 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:22:52 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:22:52 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.545 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Preparing to wait for external event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.546 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.547 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.547 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.549 182759 DEBUG nova.virt.libvirt.vif [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=157,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzNbeWQqM+R5mCLYdVsdPyQXYDqnkkhhC73mxN5fX0QRC+i5pxaSAc7LRsQKs9V1np8BzitSAx9O4U37xdH3m6MF7eYp2Ff07iBZVcoSIsB4CpGyP/xz08PAIvxm/KFgA==',key_name='tempest-TestSecurityGroupsBasicOps-135115301',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-3dy1zpts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:22:48Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=92f9af4e-c724-4454-bf9f-49ae4bdb49a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.549 182759 DEBUG nova.network.os_vif_util [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.550 182759 DEBUG nova.network.os_vif_util [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.550 182759 DEBUG os_vif [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.551 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.551 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.552 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.555 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.555 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57b7bcc8-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.556 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57b7bcc8-a7, col_values=(('external_ids', {'iface-id': '57b7bcc8-a7af-4f4d-b001-4f3b96c7d691', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:1a:79', 'vm-uuid': '92f9af4e-c724-4454-bf9f-49ae4bdb49a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.558 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:52 np0005591285 NetworkManager[55017]: <info>  [1769041372.5590] manager: (tap57b7bcc8-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.560 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.566 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.568 182759 INFO os_vif [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7')#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.638 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.639 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.639 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:0b:1a:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:22:52 np0005591285 nova_compute[182755]: 2026-01-22 00:22:52.640 182759 INFO nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Using config drive#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.101 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.210 182759 INFO nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Creating config drive at /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.config#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.219 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxu6hgskt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.241 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.350 182759 DEBUG oslo_concurrency.processutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxu6hgskt" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:22:53 np0005591285 kernel: tap57b7bcc8-a7: entered promiscuous mode
Jan 21 19:22:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:22:53Z|00604|binding|INFO|Claiming lport 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 for this chassis.
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.421 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:22:53Z|00605|binding|INFO|57b7bcc8-a7af-4f4d-b001-4f3b96c7d691: Claiming fa:16:3e:0b:1a:79 10.100.0.5
Jan 21 19:22:53 np0005591285 NetworkManager[55017]: <info>  [1769041373.4234] manager: (tap57b7bcc8-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/293)
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.438 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:1a:79 10.100.0.5'], port_security=['fa:16:3e:0b:1a:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '92f9af4e-c724-4454-bf9f-49ae4bdb49a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac7ef113-70ce-4186-8d61-02a4d7406050 fce92c79-e7f6-4f6f-8148-ac41818bc768', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f297238-9b07-4e59-b73b-faeb28c51c5f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.440 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 in datapath 92f99623-b3a9-41d7-ab3e-5bc19b701c77 bound to our chassis#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.441 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92f99623-b3a9-41d7-ab3e-5bc19b701c77#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.455 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2709ce97-0867-47bd-9f42-966a5fbc2c55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.456 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92f99623-b1 in ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.458 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92f99623-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.458 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[38006b47-d778-4162-b8b6-b62017942435]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 systemd-udevd[236501]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.459 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d658ccc7-6978-4a62-ab12-c436f7b49095]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 systemd-machined[154022]: New machine qemu-71-instance-0000009d.
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.470 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[89269108-3dd6-4570-a96c-5f0a5789bdb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 NetworkManager[55017]: <info>  [1769041373.4755] device (tap57b7bcc8-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:22:53 np0005591285 NetworkManager[55017]: <info>  [1769041373.4784] device (tap57b7bcc8-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.481 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.483 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e416ac-9f92-4530-9d8e-97dcae811a28]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:22:53Z|00606|binding|INFO|Setting lport 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 ovn-installed in OVS
Jan 21 19:22:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:22:53Z|00607|binding|INFO|Setting lport 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 up in Southbound
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.485 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 systemd[1]: Started Virtual Machine qemu-71-instance-0000009d.
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.515 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a9bf217d-df75-4930-a3fb-8fcefaa740c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.520 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e10cbe92-8eec-41b3-846f-b519c2fdf77f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 NetworkManager[55017]: <info>  [1769041373.5218] manager: (tap92f99623-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/294)
Jan 21 19:22:53 np0005591285 systemd-udevd[236506]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.548 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[b27467c3-7399-4629-b71b-01af683f5d4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.550 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ed234cff-d109-4a23-8334-ff675a9879a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 NetworkManager[55017]: <info>  [1769041373.5735] device (tap92f99623-b0): carrier: link connected
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.579 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[881d6d48-3722-40a1-88d8-c98f8d107db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.596 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e494750-93c3-4150-b01f-2680cbe2a861]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92f99623-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:be:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593223, 'reachable_time': 39222, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236534, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.613 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8253db43-5a48-4cf3-877f-90bed51259eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:bee7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593223, 'tstamp': 593223}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236535, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.634 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3c85ca40-9f30-4506-bdc5-e66463fbe026]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92f99623-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:be:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593223, 'reachable_time': 39222, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236536, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.678 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[112b39c9-28ea-409f-8051-b1fcefc9f746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.729 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041373.7290425, 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.730 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] VM Started (Lifecycle Event)#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.751 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.755 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[44a07a5d-702b-42f6-9f57-b6fd867ea777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.756 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041373.7293105, 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.756 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.757 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92f99623-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.757 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.758 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92f99623-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.759 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 NetworkManager[55017]: <info>  [1769041373.7602] manager: (tap92f99623-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Jan 21 19:22:53 np0005591285 kernel: tap92f99623-b0: entered promiscuous mode
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.762 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92f99623-b0, col_values=(('external_ids', {'iface-id': '66ff7839-fd4b-434d-8f68-86c957e9ef5e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.763 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 ovn_controller[94908]: 2026-01-22T00:22:53Z|00608|binding|INFO|Releasing lport 66ff7839-fd4b-434d-8f68-86c957e9ef5e from this chassis (sb_readonly=0)
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.764 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.764 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92f99623-b3a9-41d7-ab3e-5bc19b701c77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92f99623-b3a9-41d7-ab3e-5bc19b701c77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.765 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[774e51ae-5957-40b8-af7f-a53e250f89ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.766 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-92f99623-b3a9-41d7-ab3e-5bc19b701c77
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/92f99623-b3a9-41d7-ab3e-5bc19b701c77.pid.haproxy
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 92f99623-b3a9-41d7-ab3e-5bc19b701c77
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:22:53 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:53.767 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'env', 'PROCESS_TAG=haproxy-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92f99623-b3a9-41d7-ab3e-5bc19b701c77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.772 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.774 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.776 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:22:53 np0005591285 nova_compute[182755]: 2026-01-22 00:22:53.793 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.094 182759 DEBUG nova.network.neutron [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updated VIF entry in instance network info cache for port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.095 182759 DEBUG nova.network.neutron [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updating instance_info_cache with network_info: [{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:22:54 np0005591285 podman[236575]: 2026-01-22 00:22:54.108025747 +0000 UTC m=+0.041722723 container create 277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.115 182759 DEBUG oslo_concurrency.lockutils [req-08241f45-563d-46a2-901a-fc69d3ce79c3 req-ec194fc1-92a7-469b-81da-4e3087ffb2a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:22:54 np0005591285 systemd[1]: Started libpod-conmon-277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36.scope.
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.149 182759 DEBUG nova.compute.manager [req-0f86b864-2e9e-467e-b020-ae121c37744a req-62a0a599-6b16-4ee5-abef-52ee9bd3f043 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.151 182759 DEBUG oslo_concurrency.lockutils [req-0f86b864-2e9e-467e-b020-ae121c37744a req-62a0a599-6b16-4ee5-abef-52ee9bd3f043 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.151 182759 DEBUG oslo_concurrency.lockutils [req-0f86b864-2e9e-467e-b020-ae121c37744a req-62a0a599-6b16-4ee5-abef-52ee9bd3f043 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.151 182759 DEBUG oslo_concurrency.lockutils [req-0f86b864-2e9e-467e-b020-ae121c37744a req-62a0a599-6b16-4ee5-abef-52ee9bd3f043 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.152 182759 DEBUG nova.compute.manager [req-0f86b864-2e9e-467e-b020-ae121c37744a req-62a0a599-6b16-4ee5-abef-52ee9bd3f043 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Processing event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.152 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.158 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041374.1569927, 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.158 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.159 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.162 182759 INFO nova.virt.libvirt.driver [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Instance spawned successfully.#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.162 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:22:54 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:22:54 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86a837f54eb09298508e13b57b2981d8a5a5fc0703d853a2bff6cdbd6399667b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:22:54 np0005591285 podman[236575]: 2026-01-22 00:22:54.086100158 +0000 UTC m=+0.019797154 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.191 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:22:54 np0005591285 podman[236575]: 2026-01-22 00:22:54.193113595 +0000 UTC m=+0.126810661 container init 277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.195 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.196 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.196 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.196 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.197 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.197 182759 DEBUG nova.virt.libvirt.driver [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:22:54 np0005591285 podman[236575]: 2026-01-22 00:22:54.198990003 +0000 UTC m=+0.132687009 container start 277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.201 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:22:54 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [NOTICE]   (236595) : New worker (236597) forked
Jan 21 19:22:54 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [NOTICE]   (236595) : Loading success.
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.235 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.244 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.245 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.272 182759 INFO nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Took 5.68 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.272 182759 DEBUG nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.380 182759 INFO nova.compute.manager [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Took 6.28 seconds to build instance.#033[00m
Jan 21 19:22:54 np0005591285 nova_compute[182755]: 2026-01-22 00:22:54.404 182759 DEBUG oslo_concurrency.lockutils [None req-54cf7232-b32b-4d93-8f60-e14dd36a9bd0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:55 np0005591285 podman[236606]: 2026-01-22 00:22:55.233686515 +0000 UTC m=+0.110331927 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.264 182759 DEBUG nova.compute.manager [req-5d3fae49-b669-4172-8aab-f01804460b76 req-13fce8cf-84f4-4672-99c0-af620ca985a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.265 182759 DEBUG oslo_concurrency.lockutils [req-5d3fae49-b669-4172-8aab-f01804460b76 req-13fce8cf-84f4-4672-99c0-af620ca985a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.265 182759 DEBUG oslo_concurrency.lockutils [req-5d3fae49-b669-4172-8aab-f01804460b76 req-13fce8cf-84f4-4672-99c0-af620ca985a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.265 182759 DEBUG oslo_concurrency.lockutils [req-5d3fae49-b669-4172-8aab-f01804460b76 req-13fce8cf-84f4-4672-99c0-af620ca985a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.265 182759 DEBUG nova.compute.manager [req-5d3fae49-b669-4172-8aab-f01804460b76 req-13fce8cf-84f4-4672-99c0-af620ca985a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] No waiting events found dispatching network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.265 182759 WARNING nova.compute.manager [req-5d3fae49-b669-4172-8aab-f01804460b76 req-13fce8cf-84f4-4672-99c0-af620ca985a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received unexpected event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:22:56 np0005591285 nova_compute[182755]: 2026-01-22 00:22:56.709 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:56.710 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:22:56 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:22:56.711 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:22:57 np0005591285 nova_compute[182755]: 2026-01-22 00:22:57.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:22:57 np0005591285 nova_compute[182755]: 2026-01-22 00:22:57.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:57 np0005591285 nova_compute[182755]: 2026-01-22 00:22:57.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:57 np0005591285 NetworkManager[55017]: <info>  [1769041377.6194] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Jan 21 19:22:57 np0005591285 NetworkManager[55017]: <info>  [1769041377.6205] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 21 19:22:57 np0005591285 nova_compute[182755]: 2026-01-22 00:22:57.706 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:22:57Z|00609|binding|INFO|Releasing lport 66ff7839-fd4b-434d-8f68-86c957e9ef5e from this chassis (sb_readonly=0)
Jan 21 19:22:57 np0005591285 nova_compute[182755]: 2026-01-22 00:22:57.716 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:58 np0005591285 nova_compute[182755]: 2026-01-22 00:22:58.156 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:22:58 np0005591285 nova_compute[182755]: 2026-01-22 00:22:58.365 182759 DEBUG nova.compute.manager [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-changed-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:22:58 np0005591285 nova_compute[182755]: 2026-01-22 00:22:58.366 182759 DEBUG nova.compute.manager [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Refreshing instance network info cache due to event network-changed-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:22:58 np0005591285 nova_compute[182755]: 2026-01-22 00:22:58.366 182759 DEBUG oslo_concurrency.lockutils [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:22:58 np0005591285 nova_compute[182755]: 2026-01-22 00:22:58.367 182759 DEBUG oslo_concurrency.lockutils [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:22:58 np0005591285 nova_compute[182755]: 2026-01-22 00:22:58.367 182759 DEBUG nova.network.neutron [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Refreshing network info cache for port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:22:59 np0005591285 nova_compute[182755]: 2026-01-22 00:22:59.635 182759 DEBUG nova.network.neutron [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updated VIF entry in instance network info cache for port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:22:59 np0005591285 nova_compute[182755]: 2026-01-22 00:22:59.636 182759 DEBUG nova.network.neutron [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updating instance_info_cache with network_info: [{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:22:59 np0005591285 nova_compute[182755]: 2026-01-22 00:22:59.656 182759 DEBUG oslo_concurrency.lockutils [req-1c5bfa53-dd99-446f-a375-b3ad99fc26be req-2c18dafe-baa2-4c1d-adeb-7cc592bf0114 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.320 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.385 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.387 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.445 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.617 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.618 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5579MB free_disk=73.19224166870117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.619 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.619 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.824 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.826 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.826 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.877 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.896 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.931 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:23:00 np0005591285 nova_compute[182755]: 2026-01-22 00:23:00.932 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:23:02 np0005591285 nova_compute[182755]: 2026-01-22 00:23:02.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:02.987 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:23:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:02.987 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:23:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:02.988 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:23:03 np0005591285 nova_compute[182755]: 2026-01-22 00:23:03.192 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:05.712 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:23:06 np0005591285 ovn_controller[94908]: 2026-01-22T00:23:06Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:1a:79 10.100.0.5
Jan 21 19:23:06 np0005591285 ovn_controller[94908]: 2026-01-22T00:23:06Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:1a:79 10.100.0.5
Jan 21 19:23:07 np0005591285 nova_compute[182755]: 2026-01-22 00:23:07.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:08 np0005591285 nova_compute[182755]: 2026-01-22 00:23:08.195 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:10 np0005591285 podman[236660]: 2026-01-22 00:23:10.183640219 +0000 UTC m=+0.052064941 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 19:23:10 np0005591285 podman[236659]: 2026-01-22 00:23:10.198843408 +0000 UTC m=+0.066531950 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Jan 21 19:23:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:10.890 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:ab'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e45a905-ef69-47b8-b157-96af9472b990, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7c217807-262b-45e7-a62c-ca33e3f039ed) old=Port_Binding(mac=['fa:16:3e:95:d5:ab 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:23:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:10.893 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7c217807-262b-45e7-a62c-ca33e3f039ed in datapath cc568949-a996-45b6-b055-c1780ec7685a updated#033[00m
Jan 21 19:23:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:10.894 104259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc568949-a996-45b6-b055-c1780ec7685a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 21 19:23:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:10.895 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7fdb8765-9477-49bd-a7aa-268f981e4cf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:23:11 np0005591285 ovn_controller[94908]: 2026-01-22T00:23:11Z|00610|binding|INFO|Releasing lport 66ff7839-fd4b-434d-8f68-86c957e9ef5e from this chassis (sb_readonly=0)
Jan 21 19:23:11 np0005591285 nova_compute[182755]: 2026-01-22 00:23:11.202 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:12 np0005591285 nova_compute[182755]: 2026-01-22 00:23:12.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:13 np0005591285 nova_compute[182755]: 2026-01-22 00:23:13.197 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:17 np0005591285 nova_compute[182755]: 2026-01-22 00:23:17.573 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:18 np0005591285 nova_compute[182755]: 2026-01-22 00:23:18.199 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:20 np0005591285 podman[236698]: 2026-01-22 00:23:20.17216147 +0000 UTC m=+0.049244956 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:23:22 np0005591285 podman[236723]: 2026-01-22 00:23:22.178008828 +0000 UTC m=+0.052349839 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:23:22 np0005591285 podman[236722]: 2026-01-22 00:23:22.209743741 +0000 UTC m=+0.083703022 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:23:22 np0005591285 nova_compute[182755]: 2026-01-22 00:23:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:22 np0005591285 nova_compute[182755]: 2026-01-22 00:23:22.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:23:22 np0005591285 nova_compute[182755]: 2026-01-22 00:23:22.576 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:22 np0005591285 nova_compute[182755]: 2026-01-22 00:23:22.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:23 np0005591285 nova_compute[182755]: 2026-01-22 00:23:23.200 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:24 np0005591285 nova_compute[182755]: 2026-01-22 00:23:24.166 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:26 np0005591285 podman[236765]: 2026-01-22 00:23:26.235476701 +0000 UTC m=+0.105290503 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:23:27 np0005591285 nova_compute[182755]: 2026-01-22 00:23:27.584 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:28 np0005591285 nova_compute[182755]: 2026-01-22 00:23:28.202 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:31 np0005591285 nova_compute[182755]: 2026-01-22 00:23:31.208 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:32 np0005591285 nova_compute[182755]: 2026-01-22 00:23:32.586 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:33 np0005591285 nova_compute[182755]: 2026-01-22 00:23:33.213 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:33 np0005591285 nova_compute[182755]: 2026-01-22 00:23:33.241 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:34 np0005591285 nova_compute[182755]: 2026-01-22 00:23:34.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:37 np0005591285 nova_compute[182755]: 2026-01-22 00:23:37.589 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:38 np0005591285 nova_compute[182755]: 2026-01-22 00:23:38.216 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:39 np0005591285 nova_compute[182755]: 2026-01-22 00:23:39.919 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:41 np0005591285 podman[236791]: 2026-01-22 00:23:41.185611519 +0000 UTC m=+0.062445761 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9)
Jan 21 19:23:41 np0005591285 podman[236792]: 2026-01-22 00:23:41.185630959 +0000 UTC m=+0.058926765 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:23:42 np0005591285 nova_compute[182755]: 2026-01-22 00:23:42.626 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:43 np0005591285 nova_compute[182755]: 2026-01-22 00:23:43.218 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:47 np0005591285 nova_compute[182755]: 2026-01-22 00:23:47.630 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:48 np0005591285 nova_compute[182755]: 2026-01-22 00:23:48.220 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:48 np0005591285 nova_compute[182755]: 2026-01-22 00:23:48.874 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:51 np0005591285 podman[236834]: 2026-01-22 00:23:51.181364294 +0000 UTC m=+0.058511215 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:23:52 np0005591285 nova_compute[182755]: 2026-01-22 00:23:52.259 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:52 np0005591285 nova_compute[182755]: 2026-01-22 00:23:52.259 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:52 np0005591285 nova_compute[182755]: 2026-01-22 00:23:52.260 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:23:52 np0005591285 nova_compute[182755]: 2026-01-22 00:23:52.633 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:53 np0005591285 podman[236858]: 2026-01-22 00:23:53.179722319 +0000 UTC m=+0.047170739 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 21 19:23:53 np0005591285 podman[236859]: 2026-01-22 00:23:53.210938639 +0000 UTC m=+0.073520288 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:23:53 np0005591285 nova_compute[182755]: 2026-01-22 00:23:53.300 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:54 np0005591285 nova_compute[182755]: 2026-01-22 00:23:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:54 np0005591285 nova_compute[182755]: 2026-01-22 00:23:54.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:23:54 np0005591285 nova_compute[182755]: 2026-01-22 00:23:54.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:23:55 np0005591285 nova_compute[182755]: 2026-01-22 00:23:55.416 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:23:55 np0005591285 nova_compute[182755]: 2026-01-22 00:23:55.417 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:23:55 np0005591285 nova_compute[182755]: 2026-01-22 00:23:55.417 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:23:55 np0005591285 nova_compute[182755]: 2026-01-22 00:23:55.417 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:23:57 np0005591285 podman[236902]: 2026-01-22 00:23:57.215291816 +0000 UTC m=+0.082312365 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:23:57 np0005591285 nova_compute[182755]: 2026-01-22 00:23:57.310 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updating instance_info_cache with network_info: [{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:23:57 np0005591285 nova_compute[182755]: 2026-01-22 00:23:57.356 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:23:57 np0005591285 nova_compute[182755]: 2026-01-22 00:23:57.357 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:23:57 np0005591285 nova_compute[182755]: 2026-01-22 00:23:57.357 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:57 np0005591285 nova_compute[182755]: 2026-01-22 00:23:57.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:58 np0005591285 nova_compute[182755]: 2026-01-22 00:23:58.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:58 np0005591285 nova_compute[182755]: 2026-01-22 00:23:58.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:23:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:58.225 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:23:58 np0005591285 nova_compute[182755]: 2026-01-22 00:23:58.226 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:23:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:23:58.226 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:23:58 np0005591285 nova_compute[182755]: 2026-01-22 00:23:58.304 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:00 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:00.228 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.331 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.331 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.331 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.332 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.451 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.514 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.515 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.580 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.736 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.738 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5577MB free_disk=73.16440200805664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.738 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:01 np0005591285 nova_compute[182755]: 2026-01-22 00:24:01.739 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.196 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.197 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.197 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.333 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.353 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.355 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.355 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:02 np0005591285 nova_compute[182755]: 2026-01-22 00:24:02.638 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:02.988 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:02.988 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:02.989 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:03 np0005591285 nova_compute[182755]: 2026-01-22 00:24:03.304 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:03 np0005591285 nova_compute[182755]: 2026-01-22 00:24:03.355 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:03 np0005591285 nova_compute[182755]: 2026-01-22 00:24:03.380 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.168 182759 DEBUG nova.compute.manager [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-changed-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.169 182759 DEBUG nova.compute.manager [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Refreshing instance network info cache due to event network-changed-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.169 182759 DEBUG oslo_concurrency.lockutils [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.169 182759 DEBUG oslo_concurrency.lockutils [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.169 182759 DEBUG nova.network.neutron [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Refreshing network info cache for port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.293 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.294 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.294 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.295 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.295 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.307 182759 INFO nova.compute.manager [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Terminating instance#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.318 182759 DEBUG nova.compute.manager [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:24:04 np0005591285 kernel: tap57b7bcc8-a7 (unregistering): left promiscuous mode
Jan 21 19:24:04 np0005591285 NetworkManager[55017]: <info>  [1769041444.3417] device (tap57b7bcc8-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:24:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:04Z|00611|binding|INFO|Releasing lport 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 from this chassis (sb_readonly=0)
Jan 21 19:24:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:04Z|00612|binding|INFO|Setting lport 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 down in Southbound
Jan 21 19:24:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:04Z|00613|binding|INFO|Removing iface tap57b7bcc8-a7 ovn-installed in OVS
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.358 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:1a:79 10.100.0.5'], port_security=['fa:16:3e:0b:1a:79 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '92f9af4e-c724-4454-bf9f-49ae4bdb49a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ac7ef113-70ce-4186-8d61-02a4d7406050 fce92c79-e7f6-4f6f-8148-ac41818bc768', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f297238-9b07-4e59-b73b-faeb28c51c5f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.360 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 in datapath 92f99623-b3a9-41d7-ab3e-5bc19b701c77 unbound from our chassis#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.361 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92f99623-b3a9-41d7-ab3e-5bc19b701c77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.362 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9185d45f-62e8-4da3-8ef0-f099784b4c87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.364 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 namespace which is not needed anymore#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.367 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009d.scope: Deactivated successfully.
Jan 21 19:24:04 np0005591285 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000009d.scope: Consumed 15.284s CPU time.
Jan 21 19:24:04 np0005591285 systemd-machined[154022]: Machine qemu-71-instance-0000009d terminated.
Jan 21 19:24:04 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [NOTICE]   (236595) : haproxy version is 2.8.14-c23fe91
Jan 21 19:24:04 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [NOTICE]   (236595) : path to executable is /usr/sbin/haproxy
Jan 21 19:24:04 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [WARNING]  (236595) : Exiting Master process...
Jan 21 19:24:04 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [WARNING]  (236595) : Exiting Master process...
Jan 21 19:24:04 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [ALERT]    (236595) : Current worker (236597) exited with code 143 (Terminated)
Jan 21 19:24:04 np0005591285 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[236591]: [WARNING]  (236595) : All workers exited. Exiting... (0)
Jan 21 19:24:04 np0005591285 systemd[1]: libpod-277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36.scope: Deactivated successfully.
Jan 21 19:24:04 np0005591285 podman[236958]: 2026-01-22 00:24:04.520230084 +0000 UTC m=+0.050226982 container died 277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.551 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36-userdata-shm.mount: Deactivated successfully.
Jan 21 19:24:04 np0005591285 systemd[1]: var-lib-containers-storage-overlay-86a837f54eb09298508e13b57b2981d8a5a5fc0703d853a2bff6cdbd6399667b-merged.mount: Deactivated successfully.
Jan 21 19:24:04 np0005591285 podman[236958]: 2026-01-22 00:24:04.567033633 +0000 UTC m=+0.097030521 container cleanup 277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:24:04 np0005591285 systemd[1]: libpod-conmon-277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36.scope: Deactivated successfully.
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.598 182759 INFO nova.virt.libvirt.driver [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Instance destroyed successfully.#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.601 182759 DEBUG nova.objects.instance [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.627 182759 DEBUG nova.virt.libvirt.vif [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:22:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-966065747',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=157,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzNbeWQqM+R5mCLYdVsdPyQXYDqnkkhhC73mxN5fX0QRC+i5pxaSAc7LRsQKs9V1np8BzitSAx9O4U37xdH3m6MF7eYp2Ff07iBZVcoSIsB4CpGyP/xz08PAIvxm/KFgA==',key_name='tempest-TestSecurityGroupsBasicOps-135115301',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:22:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-3dy1zpts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:22:54Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=92f9af4e-c724-4454-bf9f-49ae4bdb49a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.628 182759 DEBUG nova.network.os_vif_util [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.630 182759 DEBUG nova.network.os_vif_util [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.631 182759 DEBUG os_vif [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.633 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.634 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57b7bcc8-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.638 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.646 182759 INFO os_vif [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:1a:79,bridge_name='br-int',has_traffic_filtering=True,id=57b7bcc8-a7af-4f4d-b001-4f3b96c7d691,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57b7bcc8-a7')#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.647 182759 INFO nova.virt.libvirt.driver [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Deleting instance files /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3_del#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.648 182759 INFO nova.virt.libvirt.driver [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Deletion of /var/lib/nova/instances/92f9af4e-c724-4454-bf9f-49ae4bdb49a3_del complete#033[00m
Jan 21 19:24:04 np0005591285 podman[237000]: 2026-01-22 00:24:04.651499154 +0000 UTC m=+0.054891316 container remove 277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.657 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[971fa6f8-9aba-4875-883e-c2a94ef6c45b]: (4, ('Thu Jan 22 12:24:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 (277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36)\n277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36\nThu Jan 22 12:24:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 (277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36)\n277c5df94dd8cab519dc3eb012785ef3c1d6d011b18a299ed1a734033dd38d36\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.658 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d8d85d-5bd9-4fb4-9060-60eda407706e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.659 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92f99623-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.661 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 kernel: tap92f99623-b0: left promiscuous mode
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.663 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.665 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eeea3e80-6a81-49f6-9be6-95a6c2b48a06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.675 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.685 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f07c4b18-75b9-4ce1-9941-d5bd3dcea3da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.686 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[244cf78a-44a7-4eb2-9144-e8ed492a1c71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.702 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[98ba8983-978e-44a1-88bd-7d3050e36e99]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593216, 'reachable_time': 44994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237015, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 systemd[1]: run-netns-ovnmeta\x2d92f99623\x2db3a9\x2d41d7\x2dab3e\x2d5bc19b701c77.mount: Deactivated successfully.
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.706 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:24:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:04.706 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[dac4c31f-a545-4869-9ef0-36785df874e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.724 182759 INFO nova.compute.manager [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.725 182759 DEBUG oslo.service.loopingcall [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.725 182759 DEBUG nova.compute.manager [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.725 182759 DEBUG nova.network.neutron [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.741 182759 DEBUG nova.compute.manager [req-3e476dbf-d44c-4add-820e-96777954b5ac req-314e1c24-174e-4015-a6ca-1c063cf0470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-vif-unplugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.741 182759 DEBUG oslo_concurrency.lockutils [req-3e476dbf-d44c-4add-820e-96777954b5ac req-314e1c24-174e-4015-a6ca-1c063cf0470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.742 182759 DEBUG oslo_concurrency.lockutils [req-3e476dbf-d44c-4add-820e-96777954b5ac req-314e1c24-174e-4015-a6ca-1c063cf0470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.742 182759 DEBUG oslo_concurrency.lockutils [req-3e476dbf-d44c-4add-820e-96777954b5ac req-314e1c24-174e-4015-a6ca-1c063cf0470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.742 182759 DEBUG nova.compute.manager [req-3e476dbf-d44c-4add-820e-96777954b5ac req-314e1c24-174e-4015-a6ca-1c063cf0470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] No waiting events found dispatching network-vif-unplugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:24:04 np0005591285 nova_compute[182755]: 2026-01-22 00:24:04.742 182759 DEBUG nova.compute.manager [req-3e476dbf-d44c-4add-820e-96777954b5ac req-314e1c24-174e-4015-a6ca-1c063cf0470f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-vif-unplugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.441 182759 DEBUG nova.network.neutron [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.469 182759 INFO nova.compute.manager [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Took 0.74 seconds to deallocate network for instance.#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.575 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.575 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.622 182759 DEBUG nova.compute.provider_tree [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.644 182759 DEBUG nova.scheduler.client.report [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.672 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.709 182759 INFO nova.scheduler.client.report [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 92f9af4e-c724-4454-bf9f-49ae4bdb49a3#033[00m
Jan 21 19:24:05 np0005591285 nova_compute[182755]: 2026-01-22 00:24:05.876 182759 DEBUG oslo_concurrency.lockutils [None req-ac4b72e8-5dae-4bd5-a860-6271f7713a84 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:06 np0005591285 nova_compute[182755]: 2026-01-22 00:24:06.141 182759 DEBUG nova.network.neutron [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updated VIF entry in instance network info cache for port 57b7bcc8-a7af-4f4d-b001-4f3b96c7d691. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:24:06 np0005591285 nova_compute[182755]: 2026-01-22 00:24:06.141 182759 DEBUG nova.network.neutron [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Updating instance_info_cache with network_info: [{"id": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "address": "fa:16:3e:0b:1a:79", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57b7bcc8-a7", "ovs_interfaceid": "57b7bcc8-a7af-4f4d-b001-4f3b96c7d691", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:24:06 np0005591285 nova_compute[182755]: 2026-01-22 00:24:06.163 182759 DEBUG oslo_concurrency.lockutils [req-d16f3aa2-1a95-475a-b729-af3663a6cad3 req-82a443df-6f85-4874-96c8-732ce396999a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-92f9af4e-c724-4454-bf9f-49ae4bdb49a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:24:06 np0005591285 nova_compute[182755]: 2026-01-22 00:24:06.327 182759 DEBUG nova.compute.manager [req-5885e301-679c-4e69-a981-677def3a5ec9 req-21747110-e8c6-4436-a119-fdac64a9d7a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-vif-deleted-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:07 np0005591285 nova_compute[182755]: 2026-01-22 00:24:07.009 182759 DEBUG nova.compute.manager [req-a360702b-22a3-4982-8451-d086c945c6f5 req-c55985ef-e692-4b17-be87-d0272c79503b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:07 np0005591285 nova_compute[182755]: 2026-01-22 00:24:07.009 182759 DEBUG oslo_concurrency.lockutils [req-a360702b-22a3-4982-8451-d086c945c6f5 req-c55985ef-e692-4b17-be87-d0272c79503b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:07 np0005591285 nova_compute[182755]: 2026-01-22 00:24:07.010 182759 DEBUG oslo_concurrency.lockutils [req-a360702b-22a3-4982-8451-d086c945c6f5 req-c55985ef-e692-4b17-be87-d0272c79503b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:07 np0005591285 nova_compute[182755]: 2026-01-22 00:24:07.011 182759 DEBUG oslo_concurrency.lockutils [req-a360702b-22a3-4982-8451-d086c945c6f5 req-c55985ef-e692-4b17-be87-d0272c79503b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "92f9af4e-c724-4454-bf9f-49ae4bdb49a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:07 np0005591285 nova_compute[182755]: 2026-01-22 00:24:07.011 182759 DEBUG nova.compute.manager [req-a360702b-22a3-4982-8451-d086c945c6f5 req-c55985ef-e692-4b17-be87-d0272c79503b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] No waiting events found dispatching network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:24:07 np0005591285 nova_compute[182755]: 2026-01-22 00:24:07.012 182759 WARNING nova.compute.manager [req-a360702b-22a3-4982-8451-d086c945c6f5 req-c55985ef-e692-4b17-be87-d0272c79503b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Received unexpected event network-vif-plugged-57b7bcc8-a7af-4f4d-b001-4f3b96c7d691 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:24:08 np0005591285 nova_compute[182755]: 2026-01-22 00:24:08.306 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:09 np0005591285 nova_compute[182755]: 2026-01-22 00:24:09.637 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:11 np0005591285 nova_compute[182755]: 2026-01-22 00:24:11.086 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:11 np0005591285 nova_compute[182755]: 2026-01-22 00:24:11.267 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:12 np0005591285 podman[237017]: 2026-01-22 00:24:12.181960307 +0000 UTC m=+0.058142464 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 21 19:24:12 np0005591285 podman[237018]: 2026-01-22 00:24:12.184147086 +0000 UTC m=+0.059738467 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 21 19:24:13 np0005591285 nova_compute[182755]: 2026-01-22 00:24:13.308 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:14 np0005591285 nova_compute[182755]: 2026-01-22 00:24:14.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:18 np0005591285 nova_compute[182755]: 2026-01-22 00:24:18.309 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:19 np0005591285 nova_compute[182755]: 2026-01-22 00:24:19.596 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041444.5944622, 92f9af4e-c724-4454-bf9f-49ae4bdb49a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:24:19 np0005591285 nova_compute[182755]: 2026-01-22 00:24:19.596 182759 INFO nova.compute.manager [-] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:24:19 np0005591285 nova_compute[182755]: 2026-01-22 00:24:19.630 182759 DEBUG nova.compute.manager [None req-3fb06bc8-f342-44e8-8716-e8a2ea164975 - - - - - -] [instance: 92f9af4e-c724-4454-bf9f-49ae4bdb49a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:24:19 np0005591285 nova_compute[182755]: 2026-01-22 00:24:19.642 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:22 np0005591285 podman[237055]: 2026-01-22 00:24:22.175948196 +0000 UTC m=+0.050138190 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:24:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:24:23 np0005591285 nova_compute[182755]: 2026-01-22 00:24:23.311 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:24 np0005591285 podman[237080]: 2026-01-22 00:24:24.189045848 +0000 UTC m=+0.047624762 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:24:24 np0005591285 podman[237079]: 2026-01-22 00:24:24.1988025 +0000 UTC m=+0.063760916 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 19:24:24 np0005591285 nova_compute[182755]: 2026-01-22 00:24:24.645 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:28 np0005591285 podman[237125]: 2026-01-22 00:24:28.203078684 +0000 UTC m=+0.080695440 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 19:24:28 np0005591285 nova_compute[182755]: 2026-01-22 00:24:28.313 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:29 np0005591285 nova_compute[182755]: 2026-01-22 00:24:29.648 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:33 np0005591285 nova_compute[182755]: 2026-01-22 00:24:33.315 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:34 np0005591285 nova_compute[182755]: 2026-01-22 00:24:34.651 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:38 np0005591285 nova_compute[182755]: 2026-01-22 00:24:38.316 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:39 np0005591285 nova_compute[182755]: 2026-01-22 00:24:39.654 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:39 np0005591285 nova_compute[182755]: 2026-01-22 00:24:39.810 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:39 np0005591285 nova_compute[182755]: 2026-01-22 00:24:39.810 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:39 np0005591285 nova_compute[182755]: 2026-01-22 00:24:39.838 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.370 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:42.371 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:24:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:42.372 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.526 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.526 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.535 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.535 182759 INFO nova.compute.claims [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.741 182759 DEBUG nova.compute.provider_tree [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.758 182759 DEBUG nova.scheduler.client.report [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.792 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.793 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.911 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.911 182759 DEBUG nova.network.neutron [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.950 182759 INFO nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:24:42 np0005591285 nova_compute[182755]: 2026-01-22 00:24:42.998 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:24:43 np0005591285 podman[237153]: 2026-01-22 00:24:43.181296747 +0000 UTC m=+0.053261503 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 21 19:24:43 np0005591285 podman[237152]: 2026-01-22 00:24:43.182482449 +0000 UTC m=+0.056831810 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.214 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.216 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.217 182759 INFO nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Creating image(s)#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.218 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.218 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.219 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.231 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.285 182759 DEBUG nova.policy [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.288 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.289 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.289 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.300 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.319 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.366 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.367 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.409 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.410 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.411 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.469 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.470 182759 DEBUG nova.virt.disk.api [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.470 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.525 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.526 182759 DEBUG nova.virt.disk.api [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.526 182759 DEBUG nova.objects.instance [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.551 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.551 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Ensure instance console log exists: /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.552 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.552 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:43 np0005591285 nova_compute[182755]: 2026-01-22 00:24:43.553 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:44 np0005591285 nova_compute[182755]: 2026-01-22 00:24:44.656 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:46 np0005591285 nova_compute[182755]: 2026-01-22 00:24:46.047 182759 DEBUG nova.network.neutron [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Successfully created port: 0e0f9617-ebfe-45af-98e6-38991b5338d0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.014 182759 DEBUG nova.network.neutron [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Successfully updated port: 0e0f9617-ebfe-45af-98e6-38991b5338d0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.045 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.046 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.046 182759 DEBUG nova.network.neutron [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.192 182759 DEBUG nova.compute.manager [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-changed-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.193 182759 DEBUG nova.compute.manager [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Refreshing instance network info cache due to event network-changed-0e0f9617-ebfe-45af-98e6-38991b5338d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.193 182759 DEBUG oslo_concurrency.lockutils [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.321 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:48 np0005591285 nova_compute[182755]: 2026-01-22 00:24:48.359 182759 DEBUG nova.network.neutron [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.619 182759 DEBUG nova.network.neutron [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.659 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.667 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.667 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Instance network_info: |[{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.668 182759 DEBUG oslo_concurrency.lockutils [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.668 182759 DEBUG nova.network.neutron [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Refreshing network info cache for port 0e0f9617-ebfe-45af-98e6-38991b5338d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.671 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Start _get_guest_xml network_info=[{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.675 182759 WARNING nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.681 182759 DEBUG nova.virt.libvirt.host [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.681 182759 DEBUG nova.virt.libvirt.host [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.684 182759 DEBUG nova.virt.libvirt.host [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.685 182759 DEBUG nova.virt.libvirt.host [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.686 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.686 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.686 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.687 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.687 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.687 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.687 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.687 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.688 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.688 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.688 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.688 182759 DEBUG nova.virt.hardware [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.692 182759 DEBUG nova.virt.libvirt.vif [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:24:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=161,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7jwxBkdWOz7w57AiaRcIngM4Y6BG0RAdXKZN1lpSf4fY7AaWS+RG49OCjvRqIpg6m9+OlWKeWKEGH6c13ztIF3i7IhSM5D4o2yMlEeDmvrwLxAQoPueCNJW1uOa0WZAw==',key_name='tempest-TestSecurityGroupsBasicOps-1448768175',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-p8b5xcxs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:24:43Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=3f9bded2-5958-4c54-90d9-fc4d4b658fc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.692 182759 DEBUG nova.network.os_vif_util [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.693 182759 DEBUG nova.network.os_vif_util [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.694 182759 DEBUG nova.objects.instance [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.712 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <uuid>3f9bded2-5958-4c54-90d9-fc4d4b658fc0</uuid>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <name>instance-000000a1</name>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548</nova:name>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:24:49</nova:creationTime>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        <nova:port uuid="0e0f9617-ebfe-45af-98e6-38991b5338d0">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <entry name="serial">3f9bded2-5958-4c54-90d9-fc4d4b658fc0</entry>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <entry name="uuid">3f9bded2-5958-4c54-90d9-fc4d4b658fc0</entry>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.config"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:8b:21:fa"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <target dev="tap0e0f9617-eb"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/console.log" append="off"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:24:49 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:24:49 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:24:49 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:24:49 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.714 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Preparing to wait for external event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.714 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.714 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.715 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.715 182759 DEBUG nova.virt.libvirt.vif [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:24:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=161,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7jwxBkdWOz7w57AiaRcIngM4Y6BG0RAdXKZN1lpSf4fY7AaWS+RG49OCjvRqIpg6m9+OlWKeWKEGH6c13ztIF3i7IhSM5D4o2yMlEeDmvrwLxAQoPueCNJW1uOa0WZAw==',key_name='tempest-TestSecurityGroupsBasicOps-1448768175',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-p8b5xcxs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:24:43Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=3f9bded2-5958-4c54-90d9-fc4d4b658fc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.716 182759 DEBUG nova.network.os_vif_util [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.716 182759 DEBUG nova.network.os_vif_util [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.716 182759 DEBUG os_vif [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.717 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.717 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.718 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.720 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.720 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e0f9617-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.720 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e0f9617-eb, col_values=(('external_ids', {'iface-id': '0e0f9617-ebfe-45af-98e6-38991b5338d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:21:fa', 'vm-uuid': '3f9bded2-5958-4c54-90d9-fc4d4b658fc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.722 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:49 np0005591285 NetworkManager[55017]: <info>  [1769041489.7229] manager: (tap0e0f9617-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.727 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.728 182759 INFO os_vif [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb')#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.784 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.785 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.785 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:8b:21:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:24:49 np0005591285 nova_compute[182755]: 2026-01-22 00:24:49.786 182759 INFO nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Using config drive#033[00m
Jan 21 19:24:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:50.374 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:51 np0005591285 nova_compute[182755]: 2026-01-22 00:24:51.510 182759 INFO nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Creating config drive at /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.config#033[00m
Jan 21 19:24:51 np0005591285 nova_compute[182755]: 2026-01-22 00:24:51.515 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw5snkuak execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:24:51 np0005591285 nova_compute[182755]: 2026-01-22 00:24:51.644 182759 DEBUG oslo_concurrency.processutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw5snkuak" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:24:51 np0005591285 kernel: tap0e0f9617-eb: entered promiscuous mode
Jan 21 19:24:51 np0005591285 NetworkManager[55017]: <info>  [1769041491.7025] manager: (tap0e0f9617-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/299)
Jan 21 19:24:51 np0005591285 nova_compute[182755]: 2026-01-22 00:24:51.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:51Z|00614|binding|INFO|Claiming lport 0e0f9617-ebfe-45af-98e6-38991b5338d0 for this chassis.
Jan 21 19:24:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:51Z|00615|binding|INFO|0e0f9617-ebfe-45af-98e6-38991b5338d0: Claiming fa:16:3e:8b:21:fa 10.100.0.4
Jan 21 19:24:51 np0005591285 nova_compute[182755]: 2026-01-22 00:24:51.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:51 np0005591285 systemd-udevd[237230]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:24:51 np0005591285 NetworkManager[55017]: <info>  [1769041491.7504] device (tap0e0f9617-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:24:51 np0005591285 NetworkManager[55017]: <info>  [1769041491.7516] device (tap0e0f9617-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:24:51 np0005591285 systemd-machined[154022]: New machine qemu-72-instance-000000a1.
Jan 21 19:24:51 np0005591285 systemd[1]: Started Virtual Machine qemu-72-instance-000000a1.
Jan 21 19:24:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:51Z|00616|binding|INFO|Setting lport 0e0f9617-ebfe-45af-98e6-38991b5338d0 ovn-installed in OVS
Jan 21 19:24:51 np0005591285 nova_compute[182755]: 2026-01-22 00:24:51.791 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:51 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:51Z|00617|binding|INFO|Setting lport 0e0f9617-ebfe-45af-98e6-38991b5338d0 up in Southbound
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.927 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:21:fa 10.100.0.4'], port_security=['fa:16:3e:8b:21:fa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f9bded2-5958-4c54-90d9-fc4d4b658fc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': '49fcb811-8980-4c02-8e86-5cb74b163246 cf342e9e-efd0-4e0e-9d6f-a5a24378b540', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48468636-833b-49e3-b1b9-d984040b8ee3, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=0e0f9617-ebfe-45af-98e6-38991b5338d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.928 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0f9617-ebfe-45af-98e6-38991b5338d0 in datapath 5c39d2a7-2c89-4543-a593-0bbe9a34dfef bound to our chassis#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.929 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c39d2a7-2c89-4543-a593-0bbe9a34dfef#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.941 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f568630d-d241-4094-aec9-b675f8c1f6a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.941 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c39d2a7-21 in ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.944 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c39d2a7-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.944 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9b261b-88c8-4392-b6e5-e7ee89ec2042]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.945 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dff7b85b-98ee-4457-ac41-21989a626503]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.960 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[30fe02e9-1f72-4d95-96a0-8d6de336fa8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:51.985 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cf60b8ae-cbc0-41c7-9aa6-ae6e7f9a0c5b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.012 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9ce06cd4-1788-4b6e-904e-e4d56d7d4430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 NetworkManager[55017]: <info>  [1769041492.0194] manager: (tap5c39d2a7-20): new Veth device (/org/freedesktop/NetworkManager/Devices/300)
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.019 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d532fa-b02b-4b02-b94b-e4c3734dcf54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 systemd-udevd[237232]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.054 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d89461-19fc-4790-8bd8-7b64d9d39bd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.057 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[15da1b69-45d7-4e4c-826a-d7520c0a4b45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 NetworkManager[55017]: <info>  [1769041492.0793] device (tap5c39d2a7-20): carrier: link connected
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.083 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[864fbda9-1f96-49fa-bf4b-d00966db402c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.099 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d00fb3cc-00ee-4786-8f20-b30bbd799a93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c39d2a7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:5b:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605073, 'reachable_time': 43746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237264, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.117 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dec1971d-e9ee-4b21-b6f4-8b0a203f6101]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:5bec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 605073, 'tstamp': 605073}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237265, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.134 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[911cb970-de3a-479b-955d-b5624d5de005]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c39d2a7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:5b:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605073, 'reachable_time': 43746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237266, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.164 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a9f21f-2d8f-4cce-938e-aea95d3972e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.226 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2d87448d-73e0-4077-8796-5e2ab80cee11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.228 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c39d2a7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.228 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.228 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c39d2a7-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:52 np0005591285 kernel: tap5c39d2a7-20: entered promiscuous mode
Jan 21 19:24:52 np0005591285 NetworkManager[55017]: <info>  [1769041492.2310] manager: (tap5c39d2a7-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.230 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.231 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.235 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c39d2a7-20, col_values=(('external_ids', {'iface-id': 'dbac63f8-5924-480d-ac2c-ed6dee0a255b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:52 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:52Z|00618|binding|INFO|Releasing lport dbac63f8-5924-480d-ac2c-ed6dee0a255b from this chassis (sb_readonly=0)
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.237 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.238 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.238 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a7bdd284-b793-42b4-a56a-f17a9d7f7b08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.239 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-5c39d2a7-2c89-4543-a593-0bbe9a34dfef
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.pid.haproxy
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 5c39d2a7-2c89-4543-a593-0bbe9a34dfef
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:24:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:24:52.241 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'env', 'PROCESS_TAG=haproxy-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c39d2a7-2c89-4543-a593-0bbe9a34dfef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.247 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.468 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041492.467563, 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.468 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] VM Started (Lifecycle Event)#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.495 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.499 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041492.4702568, 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.499 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.533 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.539 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:24:52 np0005591285 podman[237303]: 2026-01-22 00:24:52.571111928 +0000 UTC m=+0.047737495 container create 2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:24:52 np0005591285 systemd[1]: Started libpod-conmon-2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700.scope.
Jan 21 19:24:52 np0005591285 nova_compute[182755]: 2026-01-22 00:24:52.609 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:24:52 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:24:52 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/412b06d4169cd9c9536fe5d02a455f867edb2983747683f0b5011f330cf2b40a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:24:52 np0005591285 podman[237303]: 2026-01-22 00:24:52.54551987 +0000 UTC m=+0.022145457 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:24:52 np0005591285 podman[237303]: 2026-01-22 00:24:52.641006697 +0000 UTC m=+0.117632284 container init 2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 19:24:52 np0005591285 podman[237303]: 2026-01-22 00:24:52.646576247 +0000 UTC m=+0.123201804 container start 2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:24:52 np0005591285 podman[237317]: 2026-01-22 00:24:52.650000239 +0000 UTC m=+0.049108381 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:24:52 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [NOTICE]   (237345) : New worker (237347) forked
Jan 21 19:24:52 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [NOTICE]   (237345) : Loading success.
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.013 182759 DEBUG nova.network.neutron [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updated VIF entry in instance network info cache for port 0e0f9617-ebfe-45af-98e6-38991b5338d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.014 182759 DEBUG nova.network.neutron [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.075 182759 DEBUG oslo_concurrency.lockutils [req-9cc94f79-017c-49f2-9430-f701b6bf9dba req-550f5810-aadf-40df-8206-ad797d922ffa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.083 182759 DEBUG nova.compute.manager [req-80fc6647-7442-46cf-8f44-310657e7c90a req-b3f60301-6402-4afa-b91a-60191f77380c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.083 182759 DEBUG oslo_concurrency.lockutils [req-80fc6647-7442-46cf-8f44-310657e7c90a req-b3f60301-6402-4afa-b91a-60191f77380c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.084 182759 DEBUG oslo_concurrency.lockutils [req-80fc6647-7442-46cf-8f44-310657e7c90a req-b3f60301-6402-4afa-b91a-60191f77380c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.084 182759 DEBUG oslo_concurrency.lockutils [req-80fc6647-7442-46cf-8f44-310657e7c90a req-b3f60301-6402-4afa-b91a-60191f77380c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.084 182759 DEBUG nova.compute.manager [req-80fc6647-7442-46cf-8f44-310657e7c90a req-b3f60301-6402-4afa-b91a-60191f77380c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Processing event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.085 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.090 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041493.0899608, 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.090 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.093 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.096 182759 INFO nova.virt.libvirt.driver [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Instance spawned successfully.#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.096 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.137 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.138 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.138 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.139 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.140 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.140 182759 DEBUG nova.virt.libvirt.driver [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.154 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.158 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.188 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.245 182759 INFO nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Took 10.03 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.246 182759 DEBUG nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.322 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.335 182759 INFO nova.compute.manager [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Took 10.87 seconds to build instance.#033[00m
Jan 21 19:24:53 np0005591285 nova_compute[182755]: 2026-01-22 00:24:53.369 182759 DEBUG oslo_concurrency.lockutils [None req-d44a660c-66ea-4509-b965-1f16fe1b6244 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.611 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.611 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.612 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.612 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:24:54 np0005591285 nova_compute[182755]: 2026-01-22 00:24:54.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:55 np0005591285 podman[237357]: 2026-01-22 00:24:55.181062619 +0000 UTC m=+0.049646566 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:24:55 np0005591285 podman[237356]: 2026-01-22 00:24:55.181642605 +0000 UTC m=+0.051287180 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 19:24:55 np0005591285 nova_compute[182755]: 2026-01-22 00:24:55.224 182759 DEBUG nova.compute.manager [req-3c47ed9b-12ce-4cf4-ad1f-3194726cf1f5 req-666a9a97-e82f-40cd-ae95-ec21881ad1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:55 np0005591285 nova_compute[182755]: 2026-01-22 00:24:55.224 182759 DEBUG oslo_concurrency.lockutils [req-3c47ed9b-12ce-4cf4-ad1f-3194726cf1f5 req-666a9a97-e82f-40cd-ae95-ec21881ad1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:24:55 np0005591285 nova_compute[182755]: 2026-01-22 00:24:55.224 182759 DEBUG oslo_concurrency.lockutils [req-3c47ed9b-12ce-4cf4-ad1f-3194726cf1f5 req-666a9a97-e82f-40cd-ae95-ec21881ad1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:24:55 np0005591285 nova_compute[182755]: 2026-01-22 00:24:55.225 182759 DEBUG oslo_concurrency.lockutils [req-3c47ed9b-12ce-4cf4-ad1f-3194726cf1f5 req-666a9a97-e82f-40cd-ae95-ec21881ad1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:24:55 np0005591285 nova_compute[182755]: 2026-01-22 00:24:55.225 182759 DEBUG nova.compute.manager [req-3c47ed9b-12ce-4cf4-ad1f-3194726cf1f5 req-666a9a97-e82f-40cd-ae95-ec21881ad1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] No waiting events found dispatching network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:24:55 np0005591285 nova_compute[182755]: 2026-01-22 00:24:55.225 182759 WARNING nova.compute.manager [req-3c47ed9b-12ce-4cf4-ad1f-3194726cf1f5 req-666a9a97-e82f-40cd-ae95-ec21881ad1e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received unexpected event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:24:56 np0005591285 nova_compute[182755]: 2026-01-22 00:24:56.833 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:24:56 np0005591285 nova_compute[182755]: 2026-01-22 00:24:56.877 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:24:56 np0005591285 nova_compute[182755]: 2026-01-22 00:24:56.878 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:24:56 np0005591285 nova_compute[182755]: 2026-01-22 00:24:56.879 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:57 np0005591285 nova_compute[182755]: 2026-01-22 00:24:57.517 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:57 np0005591285 NetworkManager[55017]: <info>  [1769041497.5202] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 21 19:24:57 np0005591285 NetworkManager[55017]: <info>  [1769041497.5214] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 21 19:24:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:24:57Z|00619|binding|INFO|Releasing lport dbac63f8-5924-480d-ac2c-ed6dee0a255b from this chassis (sb_readonly=0)
Jan 21 19:24:57 np0005591285 nova_compute[182755]: 2026-01-22 00:24:57.582 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:57 np0005591285 nova_compute[182755]: 2026-01-22 00:24:57.589 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:58 np0005591285 nova_compute[182755]: 2026-01-22 00:24:58.166 182759 DEBUG nova.compute.manager [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-changed-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:24:58 np0005591285 nova_compute[182755]: 2026-01-22 00:24:58.167 182759 DEBUG nova.compute.manager [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Refreshing instance network info cache due to event network-changed-0e0f9617-ebfe-45af-98e6-38991b5338d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:24:58 np0005591285 nova_compute[182755]: 2026-01-22 00:24:58.167 182759 DEBUG oslo_concurrency.lockutils [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:24:58 np0005591285 nova_compute[182755]: 2026-01-22 00:24:58.168 182759 DEBUG oslo_concurrency.lockutils [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:24:58 np0005591285 nova_compute[182755]: 2026-01-22 00:24:58.168 182759 DEBUG nova.network.neutron [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Refreshing network info cache for port 0e0f9617-ebfe-45af-98e6-38991b5338d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:24:58 np0005591285 nova_compute[182755]: 2026-01-22 00:24:58.323 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:24:59 np0005591285 nova_compute[182755]: 2026-01-22 00:24:59.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:24:59 np0005591285 podman[237396]: 2026-01-22 00:24:59.220883819 +0000 UTC m=+0.089604680 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 21 19:24:59 np0005591285 nova_compute[182755]: 2026-01-22 00:24:59.726 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:00 np0005591285 nova_compute[182755]: 2026-01-22 00:25:00.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:25:00 np0005591285 nova_compute[182755]: 2026-01-22 00:25:00.776 182759 DEBUG nova.network.neutron [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updated VIF entry in instance network info cache for port 0e0f9617-ebfe-45af-98e6-38991b5338d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:25:00 np0005591285 nova_compute[182755]: 2026-01-22 00:25:00.778 182759 DEBUG nova.network.neutron [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:25:00 np0005591285 nova_compute[182755]: 2026-01-22 00:25:00.838 182759 DEBUG oslo_concurrency.lockutils [req-9eb823fc-57ef-4fdd-9e18-2dfdc3bab7a7 req-6c3a8bd0-8006-4aa1-a075-ecb88c43c5c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:25:02 np0005591285 nova_compute[182755]: 2026-01-22 00:25:02.230 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:25:02.989 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:25:02.990 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:25:02.991 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:25:03 np0005591285 ovn_controller[94908]: 2026-01-22T00:25:03Z|00620|binding|INFO|Releasing lport dbac63f8-5924-480d-ac2c-ed6dee0a255b from this chassis (sb_readonly=0)
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.257 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.258 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.258 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.258 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.326 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.352 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.418 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.419 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.474 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.655 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.657 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5560MB free_disk=73.19026947021484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.657 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.658 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.756 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.757 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.757 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.773 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.802 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.803 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.824 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.870 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:25:03 np0005591285 nova_compute[182755]: 2026-01-22 00:25:03.985 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:25:04 np0005591285 nova_compute[182755]: 2026-01-22 00:25:04.014 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:25:04 np0005591285 nova_compute[182755]: 2026-01-22 00:25:04.047 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:25:04 np0005591285 nova_compute[182755]: 2026-01-22 00:25:04.048 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:25:04 np0005591285 nova_compute[182755]: 2026-01-22 00:25:04.730 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:06 np0005591285 ovn_controller[94908]: 2026-01-22T00:25:06Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:21:fa 10.100.0.4
Jan 21 19:25:06 np0005591285 ovn_controller[94908]: 2026-01-22T00:25:06Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:21:fa 10.100.0.4
Jan 21 19:25:07 np0005591285 nova_compute[182755]: 2026-01-22 00:25:07.237 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:08 np0005591285 nova_compute[182755]: 2026-01-22 00:25:08.329 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:09 np0005591285 nova_compute[182755]: 2026-01-22 00:25:09.733 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:13 np0005591285 nova_compute[182755]: 2026-01-22 00:25:13.331 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:14 np0005591285 podman[237451]: 2026-01-22 00:25:14.192018704 +0000 UTC m=+0.056686965 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute)
Jan 21 19:25:14 np0005591285 podman[237450]: 2026-01-22 00:25:14.192337123 +0000 UTC m=+0.056835109 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:25:14 np0005591285 nova_compute[182755]: 2026-01-22 00:25:14.734 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:18 np0005591285 nova_compute[182755]: 2026-01-22 00:25:18.333 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:18 np0005591285 nova_compute[182755]: 2026-01-22 00:25:18.720 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:19 np0005591285 nova_compute[182755]: 2026-01-22 00:25:19.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:22 np0005591285 nova_compute[182755]: 2026-01-22 00:25:22.737 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:25:22.737 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:25:22 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:25:22.739 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:25:23 np0005591285 podman[237493]: 2026-01-22 00:25:23.210716695 +0000 UTC m=+0.069540321 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:25:23 np0005591285 nova_compute[182755]: 2026-01-22 00:25:23.335 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:24 np0005591285 nova_compute[182755]: 2026-01-22 00:25:24.740 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:24 np0005591285 nova_compute[182755]: 2026-01-22 00:25:24.851 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:26 np0005591285 podman[237517]: 2026-01-22 00:25:26.191862266 +0000 UTC m=+0.056354356 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 19:25:26 np0005591285 podman[237518]: 2026-01-22 00:25:26.226095738 +0000 UTC m=+0.081585475 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:25:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:25:26.742 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:25:28 np0005591285 nova_compute[182755]: 2026-01-22 00:25:28.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:29 np0005591285 nova_compute[182755]: 2026-01-22 00:25:29.743 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:30 np0005591285 podman[237560]: 2026-01-22 00:25:30.275114414 +0000 UTC m=+0.136865941 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 19:25:33 np0005591285 nova_compute[182755]: 2026-01-22 00:25:33.348 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:34 np0005591285 nova_compute[182755]: 2026-01-22 00:25:34.747 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:37 np0005591285 nova_compute[182755]: 2026-01-22 00:25:37.560 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:38 np0005591285 nova_compute[182755]: 2026-01-22 00:25:38.350 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:39 np0005591285 nova_compute[182755]: 2026-01-22 00:25:39.749 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:40 np0005591285 nova_compute[182755]: 2026-01-22 00:25:40.776 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:43 np0005591285 nova_compute[182755]: 2026-01-22 00:25:43.405 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:44 np0005591285 nova_compute[182755]: 2026-01-22 00:25:44.752 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:45 np0005591285 podman[237589]: 2026-01-22 00:25:45.196993929 +0000 UTC m=+0.055483903 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 21 19:25:45 np0005591285 podman[237588]: 2026-01-22 00:25:45.198480898 +0000 UTC m=+0.061432153 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 21 19:25:48 np0005591285 nova_compute[182755]: 2026-01-22 00:25:48.407 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:49 np0005591285 nova_compute[182755]: 2026-01-22 00:25:49.754 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:53 np0005591285 nova_compute[182755]: 2026-01-22 00:25:53.047 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:25:53 np0005591285 nova_compute[182755]: 2026-01-22 00:25:53.047 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:25:53 np0005591285 nova_compute[182755]: 2026-01-22 00:25:53.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.053 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.054 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.069 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:25:54 np0005591285 podman[237630]: 2026-01-22 00:25:54.19263413 +0000 UTC m=+0.059351277 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.215 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.215 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.224 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.226 182759 INFO nova.compute.claims [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.399 182759 DEBUG nova.compute.provider_tree [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.415 182759 DEBUG nova.scheduler.client.report [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.455 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.456 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.499 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.499 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.500 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.500 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.570 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.571 182759 DEBUG nova.network.neutron [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.606 182759 INFO nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.632 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.748 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.750 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.750 182759 INFO nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Creating image(s)#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.751 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "/var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.751 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "/var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.752 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "/var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.765 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.767 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.825 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.826 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.827 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.838 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.856 182759 DEBUG nova.policy [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.893 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:25:54 np0005591285 nova_compute[182755]: 2026-01-22 00:25:54.894 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.522 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk 1073741824" returned: 0 in 0.627s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.523 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.523 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.578 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.579 182759 DEBUG nova.virt.disk.api [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Checking if we can resize image /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.580 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.656 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.657 182759 DEBUG nova.virt.disk.api [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Cannot resize image /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.658 182759 DEBUG nova.objects.instance [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lazy-loading 'migration_context' on Instance uuid ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.683 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.683 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Ensure instance console log exists: /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.683 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.684 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:25:55 np0005591285 nova_compute[182755]: 2026-01-22 00:25:55.684 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:25:57 np0005591285 podman[237670]: 2026-01-22 00:25:57.172606902 +0000 UTC m=+0.049125923 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:25:57 np0005591285 podman[237671]: 2026-01-22 00:25:57.172679504 +0000 UTC m=+0.046735868 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:25:57 np0005591285 nova_compute[182755]: 2026-01-22 00:25:57.352 182759 DEBUG nova.network.neutron [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Successfully created port: 168c1e42-5626-409f-86c2-c1b2a8b11d4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 19:25:57 np0005591285 nova_compute[182755]: 2026-01-22 00:25:57.529 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:25:57 np0005591285 nova_compute[182755]: 2026-01-22 00:25:57.548 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:25:57 np0005591285 nova_compute[182755]: 2026-01-22 00:25:57.549 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 19:25:57 np0005591285 nova_compute[182755]: 2026-01-22 00:25:57.549 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:25:58 np0005591285 nova_compute[182755]: 2026-01-22 00:25:58.412 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:25:58 np0005591285 nova_compute[182755]: 2026-01-22 00:25:58.545 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.384 182759 DEBUG nova.network.neutron [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Successfully updated port: 168c1e42-5626-409f-86c2-c1b2a8b11d4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.415 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.416 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquired lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.416 182759 DEBUG nova.network.neutron [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.506 182759 DEBUG nova.compute.manager [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-changed-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.507 182759 DEBUG nova.compute.manager [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Refreshing instance network info cache due to event network-changed-168c1e42-5626-409f-86c2-c1b2a8b11d4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.507 182759 DEBUG oslo_concurrency.lockutils [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:25:59 np0005591285 nova_compute[182755]: 2026-01-22 00:25:59.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.528 182759 DEBUG nova.network.neutron [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.704 182759 DEBUG nova.compute.manager [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-changed-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.705 182759 DEBUG nova.compute.manager [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Refreshing instance network info cache due to event network-changed-0e0f9617-ebfe-45af-98e6-38991b5338d0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.705 182759 DEBUG oslo_concurrency.lockutils [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.705 182759 DEBUG oslo_concurrency.lockutils [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.705 182759 DEBUG nova.network.neutron [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Refreshing network info cache for port 0e0f9617-ebfe-45af-98e6-38991b5338d0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.937 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.938 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.938 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.939 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.939 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.955 182759 INFO nova.compute.manager [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Terminating instance
Jan 21 19:26:00 np0005591285 nova_compute[182755]: 2026-01-22 00:26:00.967 182759 DEBUG nova.compute.manager [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 19:26:00 np0005591285 kernel: tap0e0f9617-eb (unregistering): left promiscuous mode
Jan 21 19:26:00 np0005591285 NetworkManager[55017]: <info>  [1769041560.9932] device (tap0e0f9617-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.002 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:01Z|00621|binding|INFO|Releasing lport 0e0f9617-ebfe-45af-98e6-38991b5338d0 from this chassis (sb_readonly=0)
Jan 21 19:26:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:01Z|00622|binding|INFO|Setting lport 0e0f9617-ebfe-45af-98e6-38991b5338d0 down in Southbound
Jan 21 19:26:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:01Z|00623|binding|INFO|Removing iface tap0e0f9617-eb ovn-installed in OVS
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.005 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.010 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:21:fa 10.100.0.4'], port_security=['fa:16:3e:8b:21:fa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3f9bded2-5958-4c54-90d9-fc4d4b658fc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '4', 'neutron:security_group_ids': '49fcb811-8980-4c02-8e86-5cb74b163246 cf342e9e-efd0-4e0e-9d6f-a5a24378b540', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48468636-833b-49e3-b1b9-d984040b8ee3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=0e0f9617-ebfe-45af-98e6-38991b5338d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.012 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 0e0f9617-ebfe-45af-98e6-38991b5338d0 in datapath 5c39d2a7-2c89-4543-a593-0bbe9a34dfef unbound from our chassis
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.014 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c39d2a7-2c89-4543-a593-0bbe9a34dfef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.016 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d97d68f5-4fb2-4c10-a4d7-37d048d4756b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.017 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef namespace which is not needed anymore
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.020 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:01 np0005591285 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Jan 21 19:26:01 np0005591285 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d000000a1.scope: Consumed 16.801s CPU time.
Jan 21 19:26:01 np0005591285 systemd-machined[154022]: Machine qemu-72-instance-000000a1 terminated.
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.244 182759 INFO nova.virt.libvirt.driver [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Instance destroyed successfully.
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.244 182759 DEBUG nova.objects.instance [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.263 182759 DEBUG nova.virt.libvirt.vif [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:24:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1191581548',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=161,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA7jwxBkdWOz7w57AiaRcIngM4Y6BG0RAdXKZN1lpSf4fY7AaWS+RG49OCjvRqIpg6m9+OlWKeWKEGH6c13ztIF3i7IhSM5D4o2yMlEeDmvrwLxAQoPueCNJW1uOa0WZAw==',key_name='tempest-TestSecurityGroupsBasicOps-1448768175',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:24:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-p8b5xcxs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:24:53Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=3f9bded2-5958-4c54-90d9-fc4d4b658fc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.264 182759 DEBUG nova.network.os_vif_util [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.265 182759 DEBUG nova.network.os_vif_util [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.265 182759 DEBUG os_vif [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.268 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.268 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e0f9617-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.269 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.271 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.275 182759 INFO os_vif [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:21:fa,bridge_name='br-int',has_traffic_filtering=True,id=0e0f9617-ebfe-45af-98e6-38991b5338d0,network=Network(5c39d2a7-2c89-4543-a593-0bbe9a34dfef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e0f9617-eb')
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.276 182759 INFO nova.virt.libvirt.driver [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Deleting instance files /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0_del#033[00m
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.277 182759 INFO nova.virt.libvirt.driver [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Deletion of /var/lib/nova/instances/3f9bded2-5958-4c54-90d9-fc4d4b658fc0_del complete#033[00m
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.368 182759 INFO nova.compute.manager [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.368 182759 DEBUG oslo.service.loopingcall [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.369 182759 DEBUG nova.compute.manager [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.369 182759 DEBUG nova.network.neutron [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:26:01 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [NOTICE]   (237345) : haproxy version is 2.8.14-c23fe91
Jan 21 19:26:01 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [NOTICE]   (237345) : path to executable is /usr/sbin/haproxy
Jan 21 19:26:01 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [WARNING]  (237345) : Exiting Master process...
Jan 21 19:26:01 np0005591285 podman[237716]: 2026-01-22 00:26:01.670540772 +0000 UTC m=+0.648806667 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:26:01 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [WARNING]  (237345) : Exiting Master process...
Jan 21 19:26:01 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [ALERT]    (237345) : Current worker (237347) exited with code 143 (Terminated)
Jan 21 19:26:01 np0005591285 neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef[237320]: [WARNING]  (237345) : All workers exited. Exiting... (0)
Jan 21 19:26:01 np0005591285 systemd[1]: libpod-2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700.scope: Deactivated successfully.
Jan 21 19:26:01 np0005591285 podman[237760]: 2026-01-22 00:26:01.683324176 +0000 UTC m=+0.573167344 container died 2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:26:01 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700-userdata-shm.mount: Deactivated successfully.
Jan 21 19:26:01 np0005591285 systemd[1]: var-lib-containers-storage-overlay-412b06d4169cd9c9536fe5d02a455f867edb2983747683f0b5011f330cf2b40a-merged.mount: Deactivated successfully.
Jan 21 19:26:01 np0005591285 podman[237760]: 2026-01-22 00:26:01.717532885 +0000 UTC m=+0.607376063 container cleanup 2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:26:01 np0005591285 systemd[1]: libpod-conmon-2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700.scope: Deactivated successfully.
Jan 21 19:26:01 np0005591285 podman[237813]: 2026-01-22 00:26:01.91630323 +0000 UTC m=+0.178272505 container remove 2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.921 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[52790bca-dd11-4790-92e1-5777c56faa32]: (4, ('Thu Jan 22 12:26:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef (2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700)\n2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700\nThu Jan 22 12:26:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef (2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700)\n2dbee010ab29a528cd64b59c726b9fe33e883b4752f8b77fefd0686e4cb9c700\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.923 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[102760a2-8b57-4e11-ba78-955df7384f27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.925 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c39d2a7-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.927 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:01 np0005591285 kernel: tap5c39d2a7-20: left promiscuous mode
Jan 21 19:26:01 np0005591285 nova_compute[182755]: 2026-01-22 00:26:01.942 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.946 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[87528502-6f8a-4cc4-94d6-aa0e4feba251]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.960 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b91d2cef-706b-45e2-b18c-7e6d39f008a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.961 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1fea479c-24fc-4e9a-b3fd-ccc033d4ecb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.976 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[294ce907-c69c-4243-9248-5c5cd27b19bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605066, 'reachable_time': 19083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237828, 'error': None, 'target': 'ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.980 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c39d2a7-2c89-4543-a593-0bbe9a34dfef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:26:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:01.981 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[b35b9a1f-fffd-41a8-9ade-905edcca303e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:01 np0005591285 systemd[1]: run-netns-ovnmeta\x2d5c39d2a7\x2d2c89\x2d4543\x2da593\x2d0bbe9a34dfef.mount: Deactivated successfully.
Jan 21 19:26:02 np0005591285 nova_compute[182755]: 2026-01-22 00:26:02.675 182759 DEBUG nova.compute.manager [req-bab1f15f-f47c-49db-ac67-afd5e4c0a632 req-f14bf67b-4877-4aea-8ead-68500cba289c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-vif-unplugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:26:02 np0005591285 nova_compute[182755]: 2026-01-22 00:26:02.675 182759 DEBUG oslo_concurrency.lockutils [req-bab1f15f-f47c-49db-ac67-afd5e4c0a632 req-f14bf67b-4877-4aea-8ead-68500cba289c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:02 np0005591285 nova_compute[182755]: 2026-01-22 00:26:02.675 182759 DEBUG oslo_concurrency.lockutils [req-bab1f15f-f47c-49db-ac67-afd5e4c0a632 req-f14bf67b-4877-4aea-8ead-68500cba289c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:02 np0005591285 nova_compute[182755]: 2026-01-22 00:26:02.676 182759 DEBUG oslo_concurrency.lockutils [req-bab1f15f-f47c-49db-ac67-afd5e4c0a632 req-f14bf67b-4877-4aea-8ead-68500cba289c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:02 np0005591285 nova_compute[182755]: 2026-01-22 00:26:02.676 182759 DEBUG nova.compute.manager [req-bab1f15f-f47c-49db-ac67-afd5e4c0a632 req-f14bf67b-4877-4aea-8ead-68500cba289c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] No waiting events found dispatching network-vif-unplugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:26:02 np0005591285 nova_compute[182755]: 2026-01-22 00:26:02.676 182759 DEBUG nova.compute.manager [req-bab1f15f-f47c-49db-ac67-afd5e4c0a632 req-f14bf67b-4877-4aea-8ead-68500cba289c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-vif-unplugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:26:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:02.991 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:02.991 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:02.991 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:03 np0005591285 nova_compute[182755]: 2026-01-22 00:26:03.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:03 np0005591285 nova_compute[182755]: 2026-01-22 00:26:03.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:03 np0005591285 nova_compute[182755]: 2026-01-22 00:26:03.715 182759 DEBUG nova.network.neutron [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:06 np0005591285 nova_compute[182755]: 2026-01-22 00:26:06.269 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.558 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.559 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.559 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.559 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.616 182759 DEBUG nova.compute.manager [req-bc08ebf3-bec8-46ef-b1b7-2c701eb68bf5 req-22127f6d-cdd0-44a0-a655-b8c5b4f6d1cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.616 182759 DEBUG oslo_concurrency.lockutils [req-bc08ebf3-bec8-46ef-b1b7-2c701eb68bf5 req-22127f6d-cdd0-44a0-a655-b8c5b4f6d1cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.617 182759 DEBUG oslo_concurrency.lockutils [req-bc08ebf3-bec8-46ef-b1b7-2c701eb68bf5 req-22127f6d-cdd0-44a0-a655-b8c5b4f6d1cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.617 182759 DEBUG oslo_concurrency.lockutils [req-bc08ebf3-bec8-46ef-b1b7-2c701eb68bf5 req-22127f6d-cdd0-44a0-a655-b8c5b4f6d1cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.617 182759 DEBUG nova.compute.manager [req-bc08ebf3-bec8-46ef-b1b7-2c701eb68bf5 req-22127f6d-cdd0-44a0-a655-b8c5b4f6d1cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] No waiting events found dispatching network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.618 182759 WARNING nova.compute.manager [req-bc08ebf3-bec8-46ef-b1b7-2c701eb68bf5 req-22127f6d-cdd0-44a0-a655-b8c5b4f6d1cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received unexpected event network-vif-plugged-0e0f9617-ebfe-45af-98e6-38991b5338d0 for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.749 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.750 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5741MB free_disk=73.19093704223633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.751 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.751 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.778 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Releasing lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.779 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Instance network_info: |[{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.780 182759 DEBUG oslo_concurrency.lockutils [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.780 182759 DEBUG nova.network.neutron [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Refreshing network info cache for port 168c1e42-5626-409f-86c2-c1b2a8b11d4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.783 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Start _get_guest_xml network_info=[{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.787 182759 WARNING nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.792 182759 DEBUG nova.virt.libvirt.host [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.793 182759 DEBUG nova.virt.libvirt.host [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.802 182759 DEBUG nova.virt.libvirt.host [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.803 182759 DEBUG nova.virt.libvirt.host [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.804 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.805 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.805 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.805 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.806 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.806 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.806 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.806 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.806 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.807 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.807 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.807 182759 DEBUG nova.virt.hardware [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.811 182759 DEBUG nova.virt.libvirt.vif [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:25:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1118267000',display_name='tempest-TestSnapshotPattern-server-1118267000',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1118267000',id=164,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0DY/TXOQB0M9YFyDLcqcxCKEdFgCfMpGiYt7S54G4iqWyBQXCc1Xwky+N3hTMg7xuZaO7fBEy0faktvOAVQkBCk+NHBAAtdaooYCb3c3mlb/2fG1QJ9qBFBibcnv6XRw==',key_name='tempest-TestSnapshotPattern-1957582469',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c869345f15654dea91ddb775c6c3ed7d',ramdisk_id='',reservation_id='r-bn3hkves',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-735860214',owner_user_name='tempest-TestSnapshotPattern-735860214-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:54Z,user_data=None,user_id='93f27bcf715e498cbac482f96dec39c0',uuid=ba1975bd-ca63-4cb4-afd3-fb1f077c28f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.811 182759 DEBUG nova.network.os_vif_util [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converting VIF {"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.812 182759 DEBUG nova.network.os_vif_util [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:26:07 np0005591285 nova_compute[182755]: 2026-01-22 00:26:07.813 182759 DEBUG nova.objects.instance [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lazy-loading 'pci_devices' on Instance uuid ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.084 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <uuid>ba1975bd-ca63-4cb4-afd3-fb1f077c28f0</uuid>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <name>instance-000000a4</name>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestSnapshotPattern-server-1118267000</nova:name>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:26:07</nova:creationTime>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:user uuid="93f27bcf715e498cbac482f96dec39c0">tempest-TestSnapshotPattern-735860214-project-member</nova:user>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:project uuid="c869345f15654dea91ddb775c6c3ed7d">tempest-TestSnapshotPattern-735860214</nova:project>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        <nova:port uuid="168c1e42-5626-409f-86c2-c1b2a8b11d4b">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <entry name="serial">ba1975bd-ca63-4cb4-afd3-fb1f077c28f0</entry>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <entry name="uuid">ba1975bd-ca63-4cb4-afd3-fb1f077c28f0</entry>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.config"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:01:b6:65"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <target dev="tap168c1e42-56"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/console.log" append="off"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:26:08 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:26:08 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:26:08 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:26:08 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.085 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Preparing to wait for external event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.086 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.086 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.086 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.087 182759 DEBUG nova.virt.libvirt.vif [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:25:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1118267000',display_name='tempest-TestSnapshotPattern-server-1118267000',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1118267000',id=164,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0DY/TXOQB0M9YFyDLcqcxCKEdFgCfMpGiYt7S54G4iqWyBQXCc1Xwky+N3hTMg7xuZaO7fBEy0faktvOAVQkBCk+NHBAAtdaooYCb3c3mlb/2fG1QJ9qBFBibcnv6XRw==',key_name='tempest-TestSnapshotPattern-1957582469',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c869345f15654dea91ddb775c6c3ed7d',ramdisk_id='',reservation_id='r-bn3hkves',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-735860214',owner_user_name='tempest-TestSnapshotPattern-735860214-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:54Z,user_data=None,user_id='93f27bcf715e498cbac482f96dec39c0',uuid=ba1975bd-ca63-4cb4-afd3-fb1f077c28f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.087 182759 DEBUG nova.network.os_vif_util [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converting VIF {"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.087 182759 DEBUG nova.network.os_vif_util [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.088 182759 DEBUG os_vif [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.088 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.089 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.089 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.091 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.092 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap168c1e42-56, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.092 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap168c1e42-56, col_values=(('external_ids', {'iface-id': '168c1e42-5626-409f-86c2-c1b2a8b11d4b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:b6:65', 'vm-uuid': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:08 np0005591285 NetworkManager[55017]: <info>  [1769041568.1329] manager: (tap168c1e42-56): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.132 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.137 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.139 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.140 182759 INFO os_vif [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56')#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.358 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.359 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.359 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] No VIF found with MAC fa:16:3e:01:b6:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.360 182759 INFO nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Using config drive#033[00m
Jan 21 19:26:08 np0005591285 nova_compute[182755]: 2026-01-22 00:26:08.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.020 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.021 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.021 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.021 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.080 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.135 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.176 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.177 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.568 182759 DEBUG nova.network.neutron [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updated VIF entry in instance network info cache for port 0e0f9617-ebfe-45af-98e6-38991b5338d0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.569 182759 DEBUG nova.network.neutron [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [{"id": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "address": "fa:16:3e:8b:21:fa", "network": {"id": "5c39d2a7-2c89-4543-a593-0bbe9a34dfef", "bridge": "br-int", "label": "tempest-network-smoke--1486796925", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e0f9617-eb", "ovs_interfaceid": "0e0f9617-ebfe-45af-98e6-38991b5338d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:09.669 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.669 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:09.670 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:26:09 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:09.670 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.684 182759 DEBUG oslo_concurrency.lockutils [req-309432c9-2fac-417f-bb6a-3c1435b54403 req-1bcf7a00-2d2f-4c83-8fef-27884267c627 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f9bded2-5958-4c54-90d9-fc4d4b658fc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.784 182759 INFO nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Creating config drive at /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.config#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.789 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc9cxsxli execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.811 182759 DEBUG nova.network.neutron [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.913 182759 DEBUG oslo_concurrency.processutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc9cxsxli" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:26:09 np0005591285 kernel: tap168c1e42-56: entered promiscuous mode
Jan 21 19:26:09 np0005591285 NetworkManager[55017]: <info>  [1769041569.9825] manager: (tap168c1e42-56): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Jan 21 19:26:09 np0005591285 nova_compute[182755]: 2026-01-22 00:26:09.982 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:09Z|00624|binding|INFO|Claiming lport 168c1e42-5626-409f-86c2-c1b2a8b11d4b for this chassis.
Jan 21 19:26:09 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:09Z|00625|binding|INFO|168c1e42-5626-409f-86c2-c1b2a8b11d4b: Claiming fa:16:3e:01:b6:65 10.100.0.6
Jan 21 19:26:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:10Z|00626|binding|INFO|Setting lport 168c1e42-5626-409f-86c2-c1b2a8b11d4b ovn-installed in OVS
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.003 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.008 182759 INFO nova.compute.manager [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Took 8.64 seconds to deallocate network for instance.#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:10 np0005591285 systemd-udevd[237848]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:26:10 np0005591285 systemd-machined[154022]: New machine qemu-73-instance-000000a4.
Jan 21 19:26:10 np0005591285 NetworkManager[55017]: <info>  [1769041570.0380] device (tap168c1e42-56): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:26:10 np0005591285 NetworkManager[55017]: <info>  [1769041570.0393] device (tap168c1e42-56): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.053 182759 DEBUG nova.compute.manager [req-0960c123-b8fe-404d-8946-a8006ffaaab7 req-91948a6e-a806-4aef-a7b3-d3faa0f9a39b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Received event network-vif-deleted-0e0f9617-ebfe-45af-98e6-38991b5338d0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.055 182759 INFO nova.compute.manager [req-0960c123-b8fe-404d-8946-a8006ffaaab7 req-91948a6e-a806-4aef-a7b3-d3faa0f9a39b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Neutron deleted interface 0e0f9617-ebfe-45af-98e6-38991b5338d0; detaching it from the instance and deleting it from the info cache#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.055 182759 DEBUG nova.network.neutron [req-0960c123-b8fe-404d-8946-a8006ffaaab7 req-91948a6e-a806-4aef-a7b3-d3faa0f9a39b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:10 np0005591285 systemd[1]: Started Virtual Machine qemu-73-instance-000000a4.
Jan 21 19:26:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:10Z|00627|binding|INFO|Setting lport 168c1e42-5626-409f-86c2-c1b2a8b11d4b up in Southbound
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.126 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b6:65 10.100.0.6'], port_security=['fa:16:3e:01:b6:65 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd99566cf-9d10-4ed9-89fe-0fedfcd05fcc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ddf0c9e-e496-4d74-b1f7-5f7f3b8a365b, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=168c1e42-5626-409f-86c2-c1b2a8b11d4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.127 182759 DEBUG nova.compute.manager [req-0960c123-b8fe-404d-8946-a8006ffaaab7 req-91948a6e-a806-4aef-a7b3-d3faa0f9a39b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Detach interface failed, port_id=0e0f9617-ebfe-45af-98e6-38991b5338d0, reason: Instance 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.127 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 168c1e42-5626-409f-86c2-c1b2a8b11d4b in datapath ab086ee0-e007-4a86-babc-64d267c3fd5e bound to our chassis#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.128 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ab086ee0-e007-4a86-babc-64d267c3fd5e#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.139 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d930a31b-3910-4f1f-9e71-de01253c612f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.139 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapab086ee0-e1 in ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.142 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapab086ee0-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.142 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d170a634-9dab-4160-abe5-a31ece59f348]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.143 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b94e4c62-f742-4dda-867d-02e2fe42b84e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.158 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[d95678eb-fe98-4e53-a600-3fe0cf24f002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.177 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.179 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5d67852c-7bd6-4a70-ad4c-f279aaa049fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.208 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[970f8c51-622c-4599-9dae-d89ad87a9e26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.215 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b63f88-191b-4dc2-aa6d-34d616b377dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 NetworkManager[55017]: <info>  [1769041570.2174] manager: (tapab086ee0-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.222 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.223 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.248 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[17865e52-cd0f-4ce8-9d2e-5cd2d16b9b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.252 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[eedb5254-6aa3-48b7-927e-93e67d627626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.259 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.260 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:10 np0005591285 NetworkManager[55017]: <info>  [1769041570.2757] device (tapab086ee0-e0): carrier: link connected
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.280 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[97df90bb-b550-4e55-9e00-3152ff6e11d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.296 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[65d879f7-e5c8-4709-a2ee-3a88785ac39a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab086ee0-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:eb:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 612893, 'reachable_time': 38006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237883, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.312 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e79f3eda-1541-4282-b138-a27e432c2d60]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:eb16'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 612893, 'tstamp': 612893}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237884, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.326 182759 DEBUG nova.compute.provider_tree [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.331 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4a99c69c-815d-4880-8484-dd8b9967e478]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab086ee0-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:eb:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 612893, 'reachable_time': 38006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237885, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.341 182759 DEBUG nova.scheduler.client.report [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.362 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d779a58b-c5e4-4557-9183-141bcf04ebf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.370 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.403 182759 INFO nova.scheduler.client.report [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 3f9bded2-5958-4c54-90d9-fc4d4b658fc0#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.437 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdf37bf-34e2-40f3-9642-fc5dd8ab17b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.438 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab086ee0-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.438 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.439 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab086ee0-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:10 np0005591285 NetworkManager[55017]: <info>  [1769041570.4414] manager: (tapab086ee0-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 21 19:26:10 np0005591285 kernel: tapab086ee0-e0: entered promiscuous mode
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.445 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.446 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapab086ee0-e0, col_values=(('external_ids', {'iface-id': '931dfb05-b9a9-4afa-88ca-d1f52289b640'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:26:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:10Z|00628|binding|INFO|Releasing lport 931dfb05-b9a9-4afa-88ca-d1f52289b640 from this chassis (sb_readonly=0)
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.447 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.448 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ab086ee0-e007-4a86-babc-64d267c3fd5e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ab086ee0-e007-4a86-babc-64d267c3fd5e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.449 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba34233a-6cff-4496-a5c9-417a4b082e3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.450 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-ab086ee0-e007-4a86-babc-64d267c3fd5e
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/ab086ee0-e007-4a86-babc-64d267c3fd5e.pid.haproxy
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID ab086ee0-e007-4a86-babc-64d267c3fd5e
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:26:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:26:10.450 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'env', 'PROCESS_TAG=haproxy-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ab086ee0-e007-4a86-babc-64d267c3fd5e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.454 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041570.4537675, ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.455 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] VM Started (Lifecycle Event)#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.460 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.489 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.492 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041570.453936, ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.493 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.514 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.518 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.526 182759 DEBUG oslo_concurrency.lockutils [None req-2c0234d5-fa51-4521-b48b-36adc6536c40 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3f9bded2-5958-4c54-90d9-fc4d4b658fc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.549 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.614 182759 DEBUG nova.compute.manager [req-7a240ceb-74f8-4bb8-95e0-2f06b9b35db4 req-1ab28240-dfce-406f-bf8e-85232843bccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.615 182759 DEBUG oslo_concurrency.lockutils [req-7a240ceb-74f8-4bb8-95e0-2f06b9b35db4 req-1ab28240-dfce-406f-bf8e-85232843bccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.615 182759 DEBUG oslo_concurrency.lockutils [req-7a240ceb-74f8-4bb8-95e0-2f06b9b35db4 req-1ab28240-dfce-406f-bf8e-85232843bccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.615 182759 DEBUG oslo_concurrency.lockutils [req-7a240ceb-74f8-4bb8-95e0-2f06b9b35db4 req-1ab28240-dfce-406f-bf8e-85232843bccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.616 182759 DEBUG nova.compute.manager [req-7a240ceb-74f8-4bb8-95e0-2f06b9b35db4 req-1ab28240-dfce-406f-bf8e-85232843bccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Processing event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.616 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.621 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.622 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041570.6209698, ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.622 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.627 182759 INFO nova.virt.libvirt.driver [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Instance spawned successfully.#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.627 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.656 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.662 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.665 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.666 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.666 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.667 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.667 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.667 182759 DEBUG nova.virt.libvirt.driver [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.702 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.796 182759 INFO nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Took 16.05 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.797 182759 DEBUG nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:26:10 np0005591285 podman[237924]: 2026-01-22 00:26:10.829096284 +0000 UTC m=+0.050106238 container create 5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 19:26:10 np0005591285 systemd[1]: Started libpod-conmon-5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b.scope.
Jan 21 19:26:10 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:26:10 np0005591285 podman[237924]: 2026-01-22 00:26:10.800294529 +0000 UTC m=+0.021304493 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:26:10 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3570e816b4362389ff045a54f5ebaf688009d7adb573583eaf662e8bf2fcf0a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:26:10 np0005591285 podman[237924]: 2026-01-22 00:26:10.911043158 +0000 UTC m=+0.132053142 container init 5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:26:10 np0005591285 podman[237924]: 2026-01-22 00:26:10.918285022 +0000 UTC m=+0.139294976 container start 5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:26:10 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [NOTICE]   (237944) : New worker (237946) forked
Jan 21 19:26:10 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [NOTICE]   (237944) : Loading success.
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.947 182759 INFO nova.compute.manager [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Took 16.78 seconds to build instance.#033[00m
Jan 21 19:26:10 np0005591285 nova_compute[182755]: 2026-01-22 00:26:10.965 182759 DEBUG oslo_concurrency.lockutils [None req-95f3dbd5-85dc-4e67-9366-cbe74b1deafb 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:11 np0005591285 nova_compute[182755]: 2026-01-22 00:26:11.032 182759 DEBUG nova.network.neutron [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updated VIF entry in instance network info cache for port 168c1e42-5626-409f-86c2-c1b2a8b11d4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:26:11 np0005591285 nova_compute[182755]: 2026-01-22 00:26:11.033 182759 DEBUG nova.network.neutron [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:11 np0005591285 nova_compute[182755]: 2026-01-22 00:26:11.083 182759 DEBUG oslo_concurrency.lockutils [req-026792f1-1ae2-48fd-8941-4e1f6f5aee3b req-5d3719dd-356c-432e-8d75-0058d9c611e4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:26:12 np0005591285 nova_compute[182755]: 2026-01-22 00:26:12.714 182759 DEBUG nova.compute.manager [req-c5ae3e78-80a8-4b51-baea-53fcda77746d req-be3a853e-bd5d-4a79-b99b-00d5ea1e38e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:26:12 np0005591285 nova_compute[182755]: 2026-01-22 00:26:12.715 182759 DEBUG oslo_concurrency.lockutils [req-c5ae3e78-80a8-4b51-baea-53fcda77746d req-be3a853e-bd5d-4a79-b99b-00d5ea1e38e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:26:12 np0005591285 nova_compute[182755]: 2026-01-22 00:26:12.715 182759 DEBUG oslo_concurrency.lockutils [req-c5ae3e78-80a8-4b51-baea-53fcda77746d req-be3a853e-bd5d-4a79-b99b-00d5ea1e38e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:26:12 np0005591285 nova_compute[182755]: 2026-01-22 00:26:12.716 182759 DEBUG oslo_concurrency.lockutils [req-c5ae3e78-80a8-4b51-baea-53fcda77746d req-be3a853e-bd5d-4a79-b99b-00d5ea1e38e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:26:12 np0005591285 nova_compute[182755]: 2026-01-22 00:26:12.716 182759 DEBUG nova.compute.manager [req-c5ae3e78-80a8-4b51-baea-53fcda77746d req-be3a853e-bd5d-4a79-b99b-00d5ea1e38e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] No waiting events found dispatching network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:26:12 np0005591285 nova_compute[182755]: 2026-01-22 00:26:12.717 182759 WARNING nova.compute.manager [req-c5ae3e78-80a8-4b51-baea-53fcda77746d req-be3a853e-bd5d-4a79-b99b-00d5ea1e38e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received unexpected event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b for instance with vm_state active and task_state None.#033[00m
Jan 21 19:26:13 np0005591285 nova_compute[182755]: 2026-01-22 00:26:13.169 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:13 np0005591285 nova_compute[182755]: 2026-01-22 00:26:13.418 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:15Z|00629|binding|INFO|Releasing lport 931dfb05-b9a9-4afa-88ca-d1f52289b640 from this chassis (sb_readonly=0)
Jan 21 19:26:15 np0005591285 nova_compute[182755]: 2026-01-22 00:26:15.845 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:15 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:15Z|00630|binding|INFO|Releasing lport 931dfb05-b9a9-4afa-88ca-d1f52289b640 from this chassis (sb_readonly=0)
Jan 21 19:26:15 np0005591285 nova_compute[182755]: 2026-01-22 00:26:15.992 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:16 np0005591285 podman[237957]: 2026-01-22 00:26:16.184381747 +0000 UTC m=+0.055101803 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 19:26:16 np0005591285 podman[237958]: 2026-01-22 00:26:16.209723739 +0000 UTC m=+0.070632350 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:26:16 np0005591285 nova_compute[182755]: 2026-01-22 00:26:16.242 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041561.2416742, 3f9bded2-5958-4c54-90d9-fc4d4b658fc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:26:16 np0005591285 nova_compute[182755]: 2026-01-22 00:26:16.243 182759 INFO nova.compute.manager [-] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:26:16 np0005591285 nova_compute[182755]: 2026-01-22 00:26:16.287 182759 DEBUG nova.compute.manager [None req-d24245db-0d69-456d-8bf6-972518cbe145 - - - - - -] [instance: 3f9bded2-5958-4c54-90d9-fc4d4b658fc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:26:18 np0005591285 nova_compute[182755]: 2026-01-22 00:26:18.172 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:18 np0005591285 NetworkManager[55017]: <info>  [1769041578.4170] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 21 19:26:18 np0005591285 NetworkManager[55017]: <info>  [1769041578.4193] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 21 19:26:18 np0005591285 nova_compute[182755]: 2026-01-22 00:26:18.420 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:18 np0005591285 nova_compute[182755]: 2026-01-22 00:26:18.516 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:18 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:18Z|00631|binding|INFO|Releasing lport 931dfb05-b9a9-4afa-88ca-d1f52289b640 from this chassis (sb_readonly=0)
Jan 21 19:26:18 np0005591285 nova_compute[182755]: 2026-01-22 00:26:18.529 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.177 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.176 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'name': 'tempest-TestSnapshotPattern-server-1118267000', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a4', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c869345f15654dea91ddb775c6c3ed7d', 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'hostId': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.202 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.203 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84c56055-cdbd-4e1c-b306-6b047f0589b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.178388', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb1a0ef6-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': '8811c594ca9f233a7310edb9d7839997b531eb5c54e3f63354c2dd99b1e3db41'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': 
None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.178388', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb1a263e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': '0bd51e8db274a3401f12c712f0f62728565cd7397e73949501bebe34bd8bb656'}]}, 'timestamp': '2026-01-22 00:26:23.203511', '_unique_id': '0d0a400fed374027baef7f904fd769b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.205 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.210 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 / tap168c1e42-56 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.210 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1659b50-2f9b-49f2-bb11-24414ff57514', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.207583', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb1b5658-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': 'e7a342e3f1566571acf7629f8255434dff22b1be10f15bf9798152dd7493673a'}]}, 'timestamp': '2026-01-22 00:26:23.211211', '_unique_id': '36770d06d95b430bab36ddf27d1cc206'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.211 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>]
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>]
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.213 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.read.latency volume: 209564186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.read.latency volume: 28144902 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9965689-1e54-4e7c-a157-f8c7a3fcbfd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 209564186, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.213904', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb1bcb74-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'ec24daeb89de96fee03f4c3fee32fae6cc388b7106280ed025f1bc27977b1d0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28144902, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.213904', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb1bd61e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'fb517111fb971a40c4df6153065cb26d155d9800753a24d36ab96c6682287262'}]}, 'timestamp': '2026-01-22 00:26:23.214436', '_unique_id': '0700d74b3e134e229c362a368ecc2d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.214 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.215 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68327980-83c1-4b6f-a159-65c5efa7264e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.215893', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb1c1ba6-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': 'c2d94b7510f7663ce01a3dc422950b61943ac365d6405443c61510b525801dd5'}]}, 'timestamp': '2026-01-22 00:26:23.216230', '_unique_id': 'd6337fee5ef94209b8e3be0d1a7d10db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.216 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.217 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c79e4b7-2c58-45dd-b1c3-05cffac7d244', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.217647', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb1c5dfa-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': '077dc19d19ea456ea325fbe65a25e884cd3a8fcfe6781695948d8e8ca5633737'}]}, 'timestamp': '2026-01-22 00:26:23.217942', '_unique_id': 'c387b7e905d54795871292b11b2509cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.218 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.219 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>]
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.219 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea4c88e9-c2c4-4f14-8111-1ae9be731b55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.219749', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb1caff8-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': 'adafa66f7bbf6328b10c42043b28f0e579d954c5499786cc1e6d63fad4337190'}]}, 'timestamp': '2026-01-22 00:26:23.220025', '_unique_id': 'bb97db9ad6a545f9940b05e801158537'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.220 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.221 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.incoming.bytes volume: 844 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82b327d7-80bb-4002-bfb5-f29c7570fd5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 844, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.221424', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb1cf0da-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': 'd7b613526ffcd772471315969a3505b0e13321fadb7ddfc1b20ad6e54ae73ce3'}]}, 'timestamp': '2026-01-22 00:26:23.221687', '_unique_id': 'd48be95c7bd045d9aa1019037057ad5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c9925b3-b2ea-40b2-95eb-9bb3a44874a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.223236', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb1d37fc-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': '5d024218f0f00806f0157e7a3b6b9a4e1a58eae42c0043e3944bf38e0f6d0319'}]}, 'timestamp': '2026-01-22 00:26:23.223508', '_unique_id': '32a24d65bb354ceaa9a2d6fdc2859728'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.223 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.224 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.249 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/cpu volume: 11440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '895c77c7-446a-48a6-aa23-d1a509545934', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11440000000, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'timestamp': '2026-01-22T00:26:23.224981', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'fb214662-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.968636687, 'message_signature': '5e7401ff2544cf575d968ab57d75037df86621c83f9b613b91099f1dc9c211f9'}]}, 'timestamp': '2026-01-22 00:26:23.250174', '_unique_id': 'a4d0ec2d791243c38a4e815362eefbdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.265 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.266 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b48919d-04a3-4cc6-b217-8b80d133e777', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.252269', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb23b60e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.971457133, 'message_signature': 'a22d61b7469784b81d2dc10c995e62d75651500139565f2d0acc045875e1f7c7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.252269', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb23c63a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.971457133, 'message_signature': '063a3b880399834b152aa21a1308e9c3738b5dc45c982ab54e6da011bbf77789'}]}, 'timestamp': '2026-01-22 00:26:23.266482', '_unique_id': 'db56ca8cd56c4756a2f9deae33e909eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.268 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '887d2580-9e6f-4ee9-be74-5e35c8b23c31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.268525', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb2421ac-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': '579110298d46abf4f643e8ac54be66e22803c522139df16e6962fac244c42056'}]}, 'timestamp': '2026-01-22 00:26:23.268815', '_unique_id': '9b5a6a85e3314aa99c3a371d8070f8a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.269 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.270 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.270 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73ecec74-9faa-4e63-a8a4-2aa2e82b778a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'timestamp': '2026-01-22T00:26:23.270301', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'fb2466ee-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.968636687, 'message_signature': '0b64ac9fea4c29b91cff0e3de2f261ba88ae2c7ca6954013f603055e08d54f54'}]}, 'timestamp': '2026-01-22 00:26:23.270581', '_unique_id': 'd28266cbc40548328a44927c3c1839a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f12c20-bc7f-4f67-a820-65d6205826fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.272062', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb24ab36-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': '90bed16c2a45ca6c8024b3873b1bd37f893ec83dc2581f83b6d875319425a189'}]}, 'timestamp': '2026-01-22 00:26:23.272333', '_unique_id': '0dc192b1c6ea4c38b142f987462e9ea7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.273 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.273 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.read.bytes volume: 30517760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11e6ec7f-42f5-41ee-b593-f5cda1c1834d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30517760, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.273794', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb24f0aa-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'ba63ad023324d390b9027ba8bb1baa4ab5dd1de1d84c355ebe7f106e736cbc74'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.273794', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb24fb72-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'db60b92a69c58000ef92c131944ca2b03f0c5920bce9bee38690d84f8f0a757c'}]}, 'timestamp': '2026-01-22 00:26:23.274376', '_unique_id': '1ae80279e94449f9a6d8fbc55423fd32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.275 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da7aa598-bcaa-4c60-b31e-64b4a189c718', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.275884', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb25412c-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': 'c8f3739225337e215142f7833fc00e49574b42d752506c9fb11c21360a47b797'}]}, 'timestamp': '2026-01-22 00:26:23.276173', '_unique_id': 'ca7abd9ace4e41cd950ec28d13310101'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.277 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.277 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.277 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3563a225-0dce-479a-81a6-c278cc4559c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.277692', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb25870e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.971457133, 'message_signature': '28cb197baad3470eddac9fdc1531fa73b42beb04915b7b15655ab1070e85662e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.277692', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb25915e-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.971457133, 'message_signature': '6876f8e68397530dba57532846bcd5f79a406fd3ca0ede9da0a1cdf137d55861'}]}, 'timestamp': '2026-01-22 00:26:23.278238', '_unique_id': '8850e546c4254892a16b965ecff10635'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.279 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.write.requests volume: 297 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.280 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b001ed27-8a38-44fa-9078-3ec7736016e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 297, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.279715', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb25d952-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'c152b6699345237973c0198505e98f4dd9c123a9c73f786dfbe696272c675e1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.279715', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb25e618-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': '85df13839614b87b3d724682d686218edf6ed4285ac38857914ce144adfc226d'}]}, 'timestamp': '2026-01-22 00:26:23.280415', '_unique_id': 'ac0711eb667b4a879f53f8e0745af915'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.282 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.282 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestSnapshotPattern-server-1118267000>]
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.282 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f033dc6-caca-4bf6-bed5-33b9e64e8a87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'instance-000000a4-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-tap168c1e42-56', 'timestamp': '2026-01-22T00:26:23.282484', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'tap168c1e42-56', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:b6:65', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap168c1e42-56'}, 'message_id': 'fb26422a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.926811992, 'message_signature': '10245d59bfda118403cf7a4d7230a89ddeb0928dee4e943ce2299761b7583911'}]}, 'timestamp': '2026-01-22 00:26:23.282760', '_unique_id': '77bd8e1459bd4f22a30f8b55b1cd3171'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.284 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.284 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b77dd83-3a77-4e38-bb7c-949209b71f31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.284399', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb268d02-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.971457133, 'message_signature': '5e0a05259862838b4d3ed0274a858f2a4676504b263668f28b71052f86c54dea'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 
'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.284399', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb269716-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.971457133, 'message_signature': '1ea3aa49ffed081800bc5246fb4b4de33c9ba50587f02c49a5fc15bf65d398ab'}]}, 'timestamp': '2026-01-22 00:26:23.284938', '_unique_id': 'f21b5bca52294dc7a0fa776c571e31f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.286 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.286 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bded162-70bc-4105-8e0b-a5cc4bc4f7f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.286482', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb26de60-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'fa2744619f6dedc6127dbc5df52aaeb149459e2c467c917481b9f47bca2ca75d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 
'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.286482', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb26e856-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': '855c0b271b9a109465d83f045efbeb8de74cc4a0ab9bc48037649106264e0a19'}]}, 'timestamp': '2026-01-22 00:26:23.287010', '_unique_id': '5e23105e24b74995a10cf59bde01e665'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.288 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.write.latency volume: 2166254086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.288 12 DEBUG ceilometer.compute.pollsters [-] ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e699938e-b7db-41bc-b027-a6e173bf6408', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2166254086, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-vda', 'timestamp': '2026-01-22T00:26:23.288463', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'fb272d0c-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': '7531054ab40027952831ea38a3103b9c0d9368f0a1e0eb59c9ab390727341f09'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '93f27bcf715e498cbac482f96dec39c0', 'user_name': None, 'project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'project_name': None, 
'resource_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-sda', 'timestamp': '2026-01-22T00:26:23.288463', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-1118267000', 'name': 'instance-000000a4', 'instance_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'instance_type': 'm1.nano', 'host': '87cf2df7d5510f7b4590e98386f7fc8d3e8f3543484bc323976a4d97', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'fb273a4a-f728-11f0-b13b-fa163e425b77', 'monotonic_time': 6141.897622048, 'message_signature': 'ebbbfe031ca0f45c41e512e4502757a2b80a1f257e9ec7015103103c98d41534'}]}, 'timestamp': '2026-01-22 00:26:23.289132', '_unique_id': '61b85c4710514b25a9429da1c6d19010'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:26:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:26:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.308 182759 DEBUG nova.compute.manager [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-changed-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.308 182759 DEBUG nova.compute.manager [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Refreshing instance network info cache due to event network-changed-168c1e42-5626-409f-86c2-c1b2a8b11d4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.308 182759 DEBUG oslo_concurrency.lockutils [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.308 182759 DEBUG oslo_concurrency.lockutils [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.309 182759 DEBUG nova.network.neutron [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Refreshing network info cache for port 168c1e42-5626-409f-86c2-c1b2a8b11d4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:26:23 np0005591285 nova_compute[182755]: 2026-01-22 00:26:23.520 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:24 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:24Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:b6:65 10.100.0.6
Jan 21 19:26:24 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:24Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:b6:65 10.100.0.6
Jan 21 19:26:25 np0005591285 podman[238013]: 2026-01-22 00:26:25.208555916 +0000 UTC m=+0.080613128 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:26:25 np0005591285 nova_compute[182755]: 2026-01-22 00:26:25.718 182759 DEBUG nova.network.neutron [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updated VIF entry in instance network info cache for port 168c1e42-5626-409f-86c2-c1b2a8b11d4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:26:25 np0005591285 nova_compute[182755]: 2026-01-22 00:26:25.719 182759 DEBUG nova.network.neutron [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:28 np0005591285 nova_compute[182755]: 2026-01-22 00:26:28.075 182759 DEBUG oslo_concurrency.lockutils [req-20ea7058-ec10-43ff-8f20-930cab9c7757 req-0f91fc74-d557-47ea-8337-722f86ee836f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:26:28 np0005591285 nova_compute[182755]: 2026-01-22 00:26:28.183 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:28 np0005591285 podman[238038]: 2026-01-22 00:26:28.211627428 +0000 UTC m=+0.070334432 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 21 19:26:28 np0005591285 podman[238039]: 2026-01-22 00:26:28.2157435 +0000 UTC m=+0.073814876 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:26:28 np0005591285 nova_compute[182755]: 2026-01-22 00:26:28.523 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:30 np0005591285 ovn_controller[94908]: 2026-01-22T00:26:30Z|00632|binding|INFO|Releasing lport 931dfb05-b9a9-4afa-88ca-d1f52289b640 from this chassis (sb_readonly=0)
Jan 21 19:26:30 np0005591285 nova_compute[182755]: 2026-01-22 00:26:30.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:32 np0005591285 podman[238079]: 2026-01-22 00:26:32.231459321 +0000 UTC m=+0.102344282 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 21 19:26:33 np0005591285 nova_compute[182755]: 2026-01-22 00:26:33.186 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:33 np0005591285 nova_compute[182755]: 2026-01-22 00:26:33.525 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:36 np0005591285 nova_compute[182755]: 2026-01-22 00:26:36.863 182759 DEBUG nova.compute.manager [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.015 182759 INFO nova.compute.manager [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] instance snapshotting#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.334 182759 INFO nova.virt.libvirt.driver [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Beginning live snapshot process#033[00m
Jan 21 19:26:37 np0005591285 virtqemud[182299]: invalid argument: disk vda does not have an active block job
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.564 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.662 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json -f qcow2" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.664 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.730 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json -f qcow2" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.757 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.813 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.815 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp0fquygeq/6c56b4da56e74ae7b19446797ca0ae73.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.849 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp0fquygeq/6c56b4da56e74ae7b19446797ca0ae73.delta 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.852 182759 INFO nova.virt.libvirt.driver [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Jan 21 19:26:37 np0005591285 nova_compute[182755]: 2026-01-22 00:26:37.912 182759 DEBUG nova.virt.libvirt.guest [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.188 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.416 182759 DEBUG nova.virt.libvirt.guest [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.420 182759 INFO nova.virt.libvirt.driver [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.463 182759 DEBUG nova.privsep.utils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.465 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp0fquygeq/6c56b4da56e74ae7b19446797ca0ae73.delta /var/lib/nova/instances/snapshots/tmp0fquygeq/6c56b4da56e74ae7b19446797ca0ae73 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.528 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.879 182759 DEBUG oslo_concurrency.processutils [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp0fquygeq/6c56b4da56e74ae7b19446797ca0ae73.delta /var/lib/nova/instances/snapshots/tmp0fquygeq/6c56b4da56e74ae7b19446797ca0ae73" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:26:38 np0005591285 nova_compute[182755]: 2026-01-22 00:26:38.889 182759 INFO nova.virt.libvirt.driver [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Snapshot extracted, beginning image upload#033[00m
Jan 21 19:26:40 np0005591285 nova_compute[182755]: 2026-01-22 00:26:40.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:41 np0005591285 nova_compute[182755]: 2026-01-22 00:26:41.613 182759 INFO nova.virt.libvirt.driver [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Snapshot image upload complete#033[00m
Jan 21 19:26:41 np0005591285 nova_compute[182755]: 2026-01-22 00:26:41.613 182759 INFO nova.compute.manager [None req-55a78d14-9d4d-4227-a39b-fd7a4ab8d74f 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Took 4.58 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 21 19:26:43 np0005591285 nova_compute[182755]: 2026-01-22 00:26:43.234 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:43 np0005591285 nova_compute[182755]: 2026-01-22 00:26:43.530 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:46 np0005591285 nova_compute[182755]: 2026-01-22 00:26:46.573 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:47 np0005591285 podman[238144]: 2026-01-22 00:26:47.187988801 +0000 UTC m=+0.063268301 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41)
Jan 21 19:26:47 np0005591285 podman[238145]: 2026-01-22 00:26:47.193579612 +0000 UTC m=+0.063587880 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Jan 21 19:26:48 np0005591285 nova_compute[182755]: 2026-01-22 00:26:48.237 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:48 np0005591285 nova_compute[182755]: 2026-01-22 00:26:48.532 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:52 np0005591285 nova_compute[182755]: 2026-01-22 00:26:52.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:52 np0005591285 nova_compute[182755]: 2026-01-22 00:26:52.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:26:53 np0005591285 nova_compute[182755]: 2026-01-22 00:26:53.275 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:53 np0005591285 nova_compute[182755]: 2026-01-22 00:26:53.534 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.570 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.571 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.571 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:26:55 np0005591285 nova_compute[182755]: 2026-01-22 00:26:55.571 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:26:56 np0005591285 podman[238184]: 2026-01-22 00:26:56.207311521 +0000 UTC m=+0.068283386 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:26:57 np0005591285 nova_compute[182755]: 2026-01-22 00:26:57.733 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:26:57 np0005591285 nova_compute[182755]: 2026-01-22 00:26:57.817 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:26:57 np0005591285 nova_compute[182755]: 2026-01-22 00:26:57.818 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:26:57 np0005591285 nova_compute[182755]: 2026-01-22 00:26:57.819 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:26:58 np0005591285 nova_compute[182755]: 2026-01-22 00:26:58.280 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:58 np0005591285 nova_compute[182755]: 2026-01-22 00:26:58.538 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:26:59 np0005591285 podman[238209]: 2026-01-22 00:26:59.191810804 +0000 UTC m=+0.058393532 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:26:59 np0005591285 podman[238208]: 2026-01-22 00:26:59.21177035 +0000 UTC m=+0.086176567 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 19:27:00 np0005591285 nova_compute[182755]: 2026-01-22 00:27:00.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:01 np0005591285 nova_compute[182755]: 2026-01-22 00:27:01.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:27:02.992 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:27:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:27:02.993 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:27:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:27:02.994 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:27:03 np0005591285 podman[238251]: 2026-01-22 00:27:03.205268097 +0000 UTC m=+0.081688477 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.316 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.401 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.402 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.466 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.539 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.610 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.611 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5566MB free_disk=73.14781188964844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.612 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.612 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.693 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.693 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.693 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.732 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.762 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.783 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:27:03 np0005591285 nova_compute[182755]: 2026-01-22 00:27:03.784 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:27:04 np0005591285 nova_compute[182755]: 2026-01-22 00:27:04.784 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:06 np0005591285 nova_compute[182755]: 2026-01-22 00:27:06.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:08 np0005591285 nova_compute[182755]: 2026-01-22 00:27:08.284 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:08 np0005591285 nova_compute[182755]: 2026-01-22 00:27:08.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:13 np0005591285 nova_compute[182755]: 2026-01-22 00:27:13.287 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:13 np0005591285 nova_compute[182755]: 2026-01-22 00:27:13.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:18 np0005591285 podman[238284]: 2026-01-22 00:27:18.195633176 +0000 UTC m=+0.063260712 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 21 19:27:18 np0005591285 podman[238283]: 2026-01-22 00:27:18.215962473 +0000 UTC m=+0.078274016 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, version=9.6, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 19:27:18 np0005591285 nova_compute[182755]: 2026-01-22 00:27:18.332 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:18 np0005591285 nova_compute[182755]: 2026-01-22 00:27:18.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:23 np0005591285 nova_compute[182755]: 2026-01-22 00:27:23.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:23 np0005591285 nova_compute[182755]: 2026-01-22 00:27:23.548 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:27 np0005591285 podman[238325]: 2026-01-22 00:27:27.190185879 +0000 UTC m=+0.061160756 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:27:28 np0005591285 nova_compute[182755]: 2026-01-22 00:27:28.341 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:28 np0005591285 nova_compute[182755]: 2026-01-22 00:27:28.550 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:30 np0005591285 podman[238351]: 2026-01-22 00:27:30.179965673 +0000 UTC m=+0.056953452 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:27:30 np0005591285 podman[238352]: 2026-01-22 00:27:30.180076976 +0000 UTC m=+0.052810811 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:27:33 np0005591285 nova_compute[182755]: 2026-01-22 00:27:33.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:33 np0005591285 nova_compute[182755]: 2026-01-22 00:27:33.553 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:34 np0005591285 podman[238389]: 2026-01-22 00:27:34.271810542 +0000 UTC m=+0.130257454 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 19:27:35 np0005591285 ovn_controller[94908]: 2026-01-22T00:27:35Z|00633|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 19:27:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:27:37.233 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:27:37 np0005591285 nova_compute[182755]: 2026-01-22 00:27:37.233 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:37 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:27:37.234 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:27:38 np0005591285 nova_compute[182755]: 2026-01-22 00:27:38.389 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:38 np0005591285 nova_compute[182755]: 2026-01-22 00:27:38.555 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:39 np0005591285 nova_compute[182755]: 2026-01-22 00:27:39.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:39 np0005591285 nova_compute[182755]: 2026-01-22 00:27:39.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:27:39 np0005591285 nova_compute[182755]: 2026-01-22 00:27:39.324 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:27:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:27:42.236 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:27:43 np0005591285 nova_compute[182755]: 2026-01-22 00:27:43.436 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:43 np0005591285 nova_compute[182755]: 2026-01-22 00:27:43.557 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:48 np0005591285 nova_compute[182755]: 2026-01-22 00:27:48.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:48 np0005591285 nova_compute[182755]: 2026-01-22 00:27:48.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:49 np0005591285 podman[238419]: 2026-01-22 00:27:49.188770507 +0000 UTC m=+0.057398195 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:27:49 np0005591285 podman[238418]: 2026-01-22 00:27:49.203994016 +0000 UTC m=+0.067729882 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Jan 21 19:27:53 np0005591285 nova_compute[182755]: 2026-01-22 00:27:53.444 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:53 np0005591285 nova_compute[182755]: 2026-01-22 00:27:53.561 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:54 np0005591285 nova_compute[182755]: 2026-01-22 00:27:54.324 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:54 np0005591285 nova_compute[182755]: 2026-01-22 00:27:54.324 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.608 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.608 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.609 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:27:57 np0005591285 nova_compute[182755]: 2026-01-22 00:27:57.609 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:27:58 np0005591285 podman[238457]: 2026-01-22 00:27:58.218864045 +0000 UTC m=+0.085386638 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:27:58 np0005591285 nova_compute[182755]: 2026-01-22 00:27:58.448 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:27:58 np0005591285 nova_compute[182755]: 2026-01-22 00:27:58.564 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:00 np0005591285 nova_compute[182755]: 2026-01-22 00:28:00.652 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:28:00 np0005591285 nova_compute[182755]: 2026-01-22 00:28:00.693 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:28:00 np0005591285 nova_compute[182755]: 2026-01-22 00:28:00.694 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:28:00 np0005591285 nova_compute[182755]: 2026-01-22 00:28:00.695 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.148 182759 DEBUG nova.compute.manager [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-changed-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.149 182759 DEBUG nova.compute.manager [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Refreshing instance network info cache due to event network-changed-168c1e42-5626-409f-86c2-c1b2a8b11d4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.149 182759 DEBUG oslo_concurrency.lockutils [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.149 182759 DEBUG oslo_concurrency.lockutils [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.149 182759 DEBUG nova.network.neutron [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Refreshing network info cache for port 168c1e42-5626-409f-86c2-c1b2a8b11d4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:28:01 np0005591285 podman[238482]: 2026-01-22 00:28:01.181808709 +0000 UTC m=+0.052621606 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 19:28:01 np0005591285 podman[238483]: 2026-01-22 00:28:01.21498756 +0000 UTC m=+0.082913020 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.223 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.224 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.224 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.224 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.224 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.236 182759 INFO nova.compute.manager [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Terminating instance#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.247 182759 DEBUG nova.compute.manager [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:28:01 np0005591285 kernel: tap168c1e42-56 (unregistering): left promiscuous mode
Jan 21 19:28:01 np0005591285 NetworkManager[55017]: <info>  [1769041681.2745] device (tap168c1e42-56): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:28:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:28:01Z|00634|binding|INFO|Releasing lport 168c1e42-5626-409f-86c2-c1b2a8b11d4b from this chassis (sb_readonly=0)
Jan 21 19:28:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:28:01Z|00635|binding|INFO|Setting lport 168c1e42-5626-409f-86c2-c1b2a8b11d4b down in Southbound
Jan 21 19:28:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:28:01Z|00636|binding|INFO|Removing iface tap168c1e42-56 ovn-installed in OVS
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.283 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.290 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:b6:65 10.100.0.6'], port_security=['fa:16:3e:01:b6:65 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ba1975bd-ca63-4cb4-afd3-fb1f077c28f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c869345f15654dea91ddb775c6c3ed7d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd99566cf-9d10-4ed9-89fe-0fedfcd05fcc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ddf0c9e-e496-4d74-b1f7-5f7f3b8a365b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=168c1e42-5626-409f-86c2-c1b2a8b11d4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.291 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 168c1e42-5626-409f-86c2-c1b2a8b11d4b in datapath ab086ee0-e007-4a86-babc-64d267c3fd5e unbound from our chassis#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.293 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab086ee0-e007-4a86-babc-64d267c3fd5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.295 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6cec0fdc-df76-49f7-aee6-89327afc78d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.296 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e namespace which is not needed anymore#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.299 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Jan 21 19:28:01 np0005591285 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d000000a4.scope: Consumed 17.242s CPU time.
Jan 21 19:28:01 np0005591285 systemd-machined[154022]: Machine qemu-73-instance-000000a4 terminated.
Jan 21 19:28:01 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [NOTICE]   (237944) : haproxy version is 2.8.14-c23fe91
Jan 21 19:28:01 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [NOTICE]   (237944) : path to executable is /usr/sbin/haproxy
Jan 21 19:28:01 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [WARNING]  (237944) : Exiting Master process...
Jan 21 19:28:01 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [ALERT]    (237944) : Current worker (237946) exited with code 143 (Terminated)
Jan 21 19:28:01 np0005591285 neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e[237940]: [WARNING]  (237944) : All workers exited. Exiting... (0)
Jan 21 19:28:01 np0005591285 systemd[1]: libpod-5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b.scope: Deactivated successfully.
Jan 21 19:28:01 np0005591285 podman[238549]: 2026-01-22 00:28:01.434081433 +0000 UTC m=+0.047981531 container died 5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:28:01 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b-userdata-shm.mount: Deactivated successfully.
Jan 21 19:28:01 np0005591285 systemd[1]: var-lib-containers-storage-overlay-3570e816b4362389ff045a54f5ebaf688009d7adb573583eaf662e8bf2fcf0a1-merged.mount: Deactivated successfully.
Jan 21 19:28:01 np0005591285 podman[238549]: 2026-01-22 00:28:01.485536206 +0000 UTC m=+0.099436254 container cleanup 5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:28:01 np0005591285 systemd[1]: libpod-conmon-5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b.scope: Deactivated successfully.
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.529 182759 INFO nova.virt.libvirt.driver [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Instance destroyed successfully.#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.531 182759 DEBUG nova.objects.instance [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lazy-loading 'resources' on Instance uuid ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.569 182759 DEBUG nova.virt.libvirt.vif [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:25:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1118267000',display_name='tempest-TestSnapshotPattern-server-1118267000',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1118267000',id=164,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0DY/TXOQB0M9YFyDLcqcxCKEdFgCfMpGiYt7S54G4iqWyBQXCc1Xwky+N3hTMg7xuZaO7fBEy0faktvOAVQkBCk+NHBAAtdaooYCb3c3mlb/2fG1QJ9qBFBibcnv6XRw==',key_name='tempest-TestSnapshotPattern-1957582469',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:26:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c869345f15654dea91ddb775c6c3ed7d',ramdisk_id='',reservation_id='r-bn3hkves',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-735860214',owner_user_name='tempest-TestSnapshotPattern-735860214-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:26:41Z,user_data=None,user_id='93f27bcf715e498cbac482f96dec39c0',uuid=ba1975bd-ca63-4cb4-afd3-fb1f077c28f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.571 182759 DEBUG nova.network.os_vif_util [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converting VIF {"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.573 182759 DEBUG nova.network.os_vif_util [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.573 182759 DEBUG os_vif [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.575 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 podman[238595]: 2026-01-22 00:28:01.575661339 +0000 UTC m=+0.055098172 container remove 5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.576 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap168c1e42-56, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.578 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.580 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.585 182759 INFO os_vif [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:b6:65,bridge_name='br-int',has_traffic_filtering=True,id=168c1e42-5626-409f-86c2-c1b2a8b11d4b,network=Network(ab086ee0-e007-4a86-babc-64d267c3fd5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap168c1e42-56')#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.585 182759 INFO nova.virt.libvirt.driver [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Deleting instance files /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0_del#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.585 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3831e780-11ed-4e8e-af12-32613dd2417a]: (4, ('Thu Jan 22 12:28:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e (5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b)\n5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b\nThu Jan 22 12:28:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e (5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b)\n5adae3dfa59173a9a76512f1e3051cf32474ffd806a7d7384e9b2b1003abd98b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.586 182759 INFO nova.virt.libvirt.driver [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Deletion of /var/lib/nova/instances/ba1975bd-ca63-4cb4-afd3-fb1f077c28f0_del complete#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.587 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[63353e9e-2ef0-40b0-b43a-3e86f26360ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.588 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab086ee0-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.590 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 kernel: tapab086ee0-e0: left promiscuous mode
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.592 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.595 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[65c85496-bafd-4cb3-8391-7ff1b10fc7e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.607 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.629 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2be11624-6b95-4358-a333-d1a1529469a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.631 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[04b6f7fc-7bda-4285-a7ea-931ad8f0bb50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.653 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce1cf9a-85f6-4cbd-97db-04051b0f760b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 612886, 'reachable_time': 34923, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238616, 'error': None, 'target': 'ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.658 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ab086ee0-e007-4a86-babc-64d267c3fd5e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:28:01 np0005591285 systemd[1]: run-netns-ovnmeta\x2dab086ee0\x2de007\x2d4a86\x2dbabc\x2d64d267c3fd5e.mount: Deactivated successfully.
Jan 21 19:28:01 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:01.659 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[1a74bfee-7fcc-4191-bedf-2d3c0004edf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.702 182759 INFO nova.compute.manager [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.703 182759 DEBUG oslo.service.loopingcall [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.703 182759 DEBUG nova.compute.manager [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:28:01 np0005591285 nova_compute[182755]: 2026-01-22 00:28:01.703 182759 DEBUG nova.network.neutron [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:28:02 np0005591285 nova_compute[182755]: 2026-01-22 00:28:02.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:02.992 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:02.993 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:02.993 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.525 182759 DEBUG nova.compute.manager [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-vif-unplugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.525 182759 DEBUG oslo_concurrency.lockutils [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.526 182759 DEBUG oslo_concurrency.lockutils [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.527 182759 DEBUG oslo_concurrency.lockutils [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.527 182759 DEBUG nova.compute.manager [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] No waiting events found dispatching network-vif-unplugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.528 182759 DEBUG nova.compute.manager [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-vif-unplugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.528 182759 DEBUG nova.compute.manager [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.529 182759 DEBUG oslo_concurrency.lockutils [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.529 182759 DEBUG oslo_concurrency.lockutils [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.530 182759 DEBUG oslo_concurrency.lockutils [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.530 182759 DEBUG nova.compute.manager [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] No waiting events found dispatching network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.531 182759 WARNING nova.compute.manager [req-7b2ecd50-92f6-41c5-8b6c-57041b792bb1 req-bc426b7e-9396-40d3-bd89-db0074217dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received unexpected event network-vif-plugged-168c1e42-5626-409f-86c2-c1b2a8b11d4b for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:28:03 np0005591285 nova_compute[182755]: 2026-01-22 00:28:03.567 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:04 np0005591285 nova_compute[182755]: 2026-01-22 00:28:04.587 182759 DEBUG nova.network.neutron [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updated VIF entry in instance network info cache for port 168c1e42-5626-409f-86c2-c1b2a8b11d4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:28:04 np0005591285 nova_compute[182755]: 2026-01-22 00:28:04.588 182759 DEBUG nova.network.neutron [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [{"id": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "address": "fa:16:3e:01:b6:65", "network": {"id": "ab086ee0-e007-4a86-babc-64d267c3fd5e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-266383806-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c869345f15654dea91ddb775c6c3ed7d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap168c1e42-56", "ovs_interfaceid": "168c1e42-5626-409f-86c2-c1b2a8b11d4b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:28:04 np0005591285 nova_compute[182755]: 2026-01-22 00:28:04.870 182759 DEBUG oslo_concurrency.lockutils [req-37aadc0b-4893-46f5-a33f-ecb283656761 req-01c6f7af-c8b4-4f21-ad7f-5616a18f7fa8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:28:04 np0005591285 nova_compute[182755]: 2026-01-22 00:28:04.871 182759 DEBUG nova.network.neutron [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.069 182759 INFO nova.compute.manager [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Took 3.37 seconds to deallocate network for instance.#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:05 np0005591285 podman[238617]: 2026-01-22 00:28:05.242834508 +0000 UTC m=+0.112001593 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.319 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.320 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.320 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.321 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.447 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.447 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.489 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.491 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5726MB free_disk=73.17726516723633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.491 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.528 182759 DEBUG nova.compute.provider_tree [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.558 182759 DEBUG nova.scheduler.client.report [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.604 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.606 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.643 182759 INFO nova.scheduler.client.report [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Deleted allocations for instance ba1975bd-ca63-4cb4-afd3-fb1f077c28f0#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.677 182759 DEBUG nova.compute.manager [req-cd56cb6a-4bcd-4b01-be4c-aaa5eef6fae0 req-ff42abfc-113b-4d35-9bd6-333084562b49 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Received event network-vif-deleted-168c1e42-5626-409f-86c2-c1b2a8b11d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.725 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.725 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.787 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.822 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.991 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:28:05 np0005591285 nova_compute[182755]: 2026-01-22 00:28:05.992 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:06 np0005591285 nova_compute[182755]: 2026-01-22 00:28:06.166 182759 DEBUG oslo_concurrency.lockutils [None req-f7707da4-65a2-4ec9-9c18-2c903685c657 93f27bcf715e498cbac482f96dec39c0 c869345f15654dea91ddb775c6c3ed7d - - default default] Lock "ba1975bd-ca63-4cb4-afd3-fb1f077c28f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:28:06 np0005591285 nova_compute[182755]: 2026-01-22 00:28:06.451 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:06 np0005591285 nova_compute[182755]: 2026-01-22 00:28:06.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:08 np0005591285 nova_compute[182755]: 2026-01-22 00:28:08.270 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:08 np0005591285 nova_compute[182755]: 2026-01-22 00:28:08.569 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:11 np0005591285 nova_compute[182755]: 2026-01-22 00:28:11.580 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:11 np0005591285 nova_compute[182755]: 2026-01-22 00:28:11.700 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:11 np0005591285 nova_compute[182755]: 2026-01-22 00:28:11.846 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:13 np0005591285 nova_compute[182755]: 2026-01-22 00:28:13.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:13 np0005591285 nova_compute[182755]: 2026-01-22 00:28:13.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:16 np0005591285 nova_compute[182755]: 2026-01-22 00:28:16.527 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041681.5244184, ba1975bd-ca63-4cb4-afd3-fb1f077c28f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:28:16 np0005591285 nova_compute[182755]: 2026-01-22 00:28:16.527 182759 INFO nova.compute.manager [-] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:28:16 np0005591285 nova_compute[182755]: 2026-01-22 00:28:16.550 182759 DEBUG nova.compute.manager [None req-23000a06-f4d9-422d-8ca6-a01ebf631eab - - - - - -] [instance: ba1975bd-ca63-4cb4-afd3-fb1f077c28f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:28:16 np0005591285 nova_compute[182755]: 2026-01-22 00:28:16.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:18 np0005591285 nova_compute[182755]: 2026-01-22 00:28:18.572 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:20 np0005591285 podman[238647]: 2026-01-22 00:28:20.185851966 +0000 UTC m=+0.059867790 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:28:20 np0005591285 podman[238646]: 2026-01-22 00:28:20.18598868 +0000 UTC m=+0.063202310 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6)
Jan 21 19:28:21 np0005591285 nova_compute[182755]: 2026-01-22 00:28:21.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:28:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:28:23 np0005591285 nova_compute[182755]: 2026-01-22 00:28:23.574 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:25 np0005591285 nova_compute[182755]: 2026-01-22 00:28:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:25 np0005591285 nova_compute[182755]: 2026-01-22 00:28:25.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:28:26 np0005591285 nova_compute[182755]: 2026-01-22 00:28:26.585 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:28 np0005591285 nova_compute[182755]: 2026-01-22 00:28:28.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:29 np0005591285 podman[238688]: 2026-01-22 00:28:29.210681666 +0000 UTC m=+0.072840980 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:28:31 np0005591285 nova_compute[182755]: 2026-01-22 00:28:31.587 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:32 np0005591285 podman[238713]: 2026-01-22 00:28:32.164603106 +0000 UTC m=+0.039235966 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:28:32 np0005591285 podman[238712]: 2026-01-22 00:28:32.194718566 +0000 UTC m=+0.072106490 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:28:33 np0005591285 nova_compute[182755]: 2026-01-22 00:28:33.579 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:36 np0005591285 podman[238752]: 2026-01-22 00:28:36.204316363 +0000 UTC m=+0.080467074 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:28:36 np0005591285 nova_compute[182755]: 2026-01-22 00:28:36.589 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:38.255 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:28:38 np0005591285 nova_compute[182755]: 2026-01-22 00:28:38.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:38.256 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:28:38 np0005591285 nova_compute[182755]: 2026-01-22 00:28:38.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:41 np0005591285 nova_compute[182755]: 2026-01-22 00:28:41.591 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:42 np0005591285 nova_compute[182755]: 2026-01-22 00:28:42.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:43 np0005591285 nova_compute[182755]: 2026-01-22 00:28:43.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:46 np0005591285 nova_compute[182755]: 2026-01-22 00:28:46.593 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:28:48.258 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:28:48 np0005591285 nova_compute[182755]: 2026-01-22 00:28:48.632 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:51 np0005591285 podman[238779]: 2026-01-22 00:28:51.180045387 +0000 UTC m=+0.051491485 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, release=1755695350)
Jan 21 19:28:51 np0005591285 podman[238780]: 2026-01-22 00:28:51.212715376 +0000 UTC m=+0.080142397 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 19:28:51 np0005591285 nova_compute[182755]: 2026-01-22 00:28:51.596 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:53 np0005591285 nova_compute[182755]: 2026-01-22 00:28:53.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:56 np0005591285 nova_compute[182755]: 2026-01-22 00:28:56.229 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:56 np0005591285 nova_compute[182755]: 2026-01-22 00:28:56.230 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:28:56 np0005591285 nova_compute[182755]: 2026-01-22 00:28:56.597 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:28:58 np0005591285 nova_compute[182755]: 2026-01-22 00:28:58.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:58 np0005591285 nova_compute[182755]: 2026-01-22 00:28:58.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:28:58 np0005591285 nova_compute[182755]: 2026-01-22 00:28:58.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:28:58 np0005591285 nova_compute[182755]: 2026-01-22 00:28:58.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:28:58 np0005591285 nova_compute[182755]: 2026-01-22 00:28:58.239 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:28:58 np0005591285 nova_compute[182755]: 2026-01-22 00:28:58.636 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:00 np0005591285 podman[238816]: 2026-01-22 00:29:00.188629447 +0000 UTC m=+0.062322937 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:29:00 np0005591285 ovn_controller[94908]: 2026-01-22T00:29:00Z|00637|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Jan 21 19:29:01 np0005591285 nova_compute[182755]: 2026-01-22 00:29:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:01 np0005591285 nova_compute[182755]: 2026-01-22 00:29:01.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:01 np0005591285 nova_compute[182755]: 2026-01-22 00:29:01.599 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:29:02.994 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:29:02.995 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:29:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:29:02.995 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:29:03 np0005591285 podman[238840]: 2026-01-22 00:29:03.18057467 +0000 UTC m=+0.046357507 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 19:29:03 np0005591285 podman[238841]: 2026-01-22 00:29:03.204574846 +0000 UTC m=+0.058762961 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:29:03 np0005591285 nova_compute[182755]: 2026-01-22 00:29:03.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:04 np0005591285 nova_compute[182755]: 2026-01-22 00:29:04.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:05 np0005591285 nova_compute[182755]: 2026-01-22 00:29:05.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:06 np0005591285 nova_compute[182755]: 2026-01-22 00:29:06.601 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:06 np0005591285 podman[238885]: 2026-01-22 00:29:06.705943957 +0000 UTC m=+0.076110248 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.255 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.431 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.432 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5754MB free_disk=73.17742538452148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.432 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.433 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.602 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.603 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.670 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.690 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.692 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:29:07 np0005591285 nova_compute[182755]: 2026-01-22 00:29:07.692 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:29:08 np0005591285 nova_compute[182755]: 2026-01-22 00:29:08.739 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:10 np0005591285 nova_compute[182755]: 2026-01-22 00:29:10.692 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:11 np0005591285 nova_compute[182755]: 2026-01-22 00:29:11.602 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:13 np0005591285 nova_compute[182755]: 2026-01-22 00:29:13.741 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:16 np0005591285 nova_compute[182755]: 2026-01-22 00:29:16.604 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:18 np0005591285 nova_compute[182755]: 2026-01-22 00:29:18.781 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:21 np0005591285 nova_compute[182755]: 2026-01-22 00:29:21.606 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:22 np0005591285 podman[238910]: 2026-01-22 00:29:22.194535962 +0000 UTC m=+0.053777427 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:29:22 np0005591285 podman[238909]: 2026-01-22 00:29:22.197417279 +0000 UTC m=+0.061597647 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 19:29:23 np0005591285 nova_compute[182755]: 2026-01-22 00:29:23.785 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:26 np0005591285 nova_compute[182755]: 2026-01-22 00:29:26.608 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:28 np0005591285 nova_compute[182755]: 2026-01-22 00:29:28.788 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:31 np0005591285 podman[238948]: 2026-01-22 00:29:31.198232989 +0000 UTC m=+0.068407051 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:29:31 np0005591285 nova_compute[182755]: 2026-01-22 00:29:31.609 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:33 np0005591285 nova_compute[182755]: 2026-01-22 00:29:33.844 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:34 np0005591285 podman[238972]: 2026-01-22 00:29:34.17337638 +0000 UTC m=+0.048114695 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:29:34 np0005591285 podman[238973]: 2026-01-22 00:29:34.179791993 +0000 UTC m=+0.050587011 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:29:36 np0005591285 nova_compute[182755]: 2026-01-22 00:29:36.610 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:37 np0005591285 podman[239014]: 2026-01-22 00:29:37.316903529 +0000 UTC m=+0.184551103 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 19:29:38 np0005591285 nova_compute[182755]: 2026-01-22 00:29:38.846 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:29:40.680 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:29:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:29:40.681 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:29:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:29:40.681 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:29:40 np0005591285 nova_compute[182755]: 2026-01-22 00:29:40.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:41 np0005591285 nova_compute[182755]: 2026-01-22 00:29:41.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:43 np0005591285 nova_compute[182755]: 2026-01-22 00:29:43.848 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:46 np0005591285 nova_compute[182755]: 2026-01-22 00:29:46.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:48 np0005591285 nova_compute[182755]: 2026-01-22 00:29:48.893 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:51 np0005591285 nova_compute[182755]: 2026-01-22 00:29:51.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:53 np0005591285 podman[239041]: 2026-01-22 00:29:53.200818767 +0000 UTC m=+0.073611182 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:29:53 np0005591285 podman[239040]: 2026-01-22 00:29:53.218908863 +0000 UTC m=+0.094454712 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 19:29:53 np0005591285 nova_compute[182755]: 2026-01-22 00:29:53.895 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:56 np0005591285 nova_compute[182755]: 2026-01-22 00:29:56.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:29:57 np0005591285 nova_compute[182755]: 2026-01-22 00:29:57.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:29:57 np0005591285 nova_compute[182755]: 2026-01-22 00:29:57.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:29:58 np0005591285 nova_compute[182755]: 2026-01-22 00:29:58.897 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:00 np0005591285 nova_compute[182755]: 2026-01-22 00:30:00.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:00 np0005591285 nova_compute[182755]: 2026-01-22 00:30:00.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:00 np0005591285 nova_compute[182755]: 2026-01-22 00:30:00.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:30:00 np0005591285 nova_compute[182755]: 2026-01-22 00:30:00.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:30:01 np0005591285 nova_compute[182755]: 2026-01-22 00:30:01.088 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:30:01 np0005591285 nova_compute[182755]: 2026-01-22 00:30:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:01 np0005591285 nova_compute[182755]: 2026-01-22 00:30:01.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:02 np0005591285 podman[239083]: 2026-01-22 00:30:02.18808484 +0000 UTC m=+0.054116908 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:30:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:30:02.995 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:30:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:30:02.996 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:30:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:30:02.996 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:30:03 np0005591285 nova_compute[182755]: 2026-01-22 00:30:03.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:03 np0005591285 nova_compute[182755]: 2026-01-22 00:30:03.899 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:04 np0005591285 nova_compute[182755]: 2026-01-22 00:30:04.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:05 np0005591285 podman[239107]: 2026-01-22 00:30:05.173705143 +0000 UTC m=+0.046945634 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 19:30:05 np0005591285 podman[239108]: 2026-01-22 00:30:05.186139178 +0000 UTC m=+0.052502644 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:30:06 np0005591285 nova_compute[182755]: 2026-01-22 00:30:06.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:07 np0005591285 nova_compute[182755]: 2026-01-22 00:30:07.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:08 np0005591285 podman[239151]: 2026-01-22 00:30:08.218469457 +0000 UTC m=+0.083737944 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:30:08 np0005591285 nova_compute[182755]: 2026-01-22 00:30:08.948 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.293 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.294 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.294 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.294 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.424 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.424 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5753MB free_disk=73.17745590209961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.425 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.425 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.872 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.873 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.893 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.968 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.969 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:30:09 np0005591285 nova_compute[182755]: 2026-01-22 00:30:09.986 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:30:10 np0005591285 nova_compute[182755]: 2026-01-22 00:30:10.013 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:30:10 np0005591285 nova_compute[182755]: 2026-01-22 00:30:10.037 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:30:10 np0005591285 nova_compute[182755]: 2026-01-22 00:30:10.052 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:30:10 np0005591285 nova_compute[182755]: 2026-01-22 00:30:10.053 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:30:10 np0005591285 nova_compute[182755]: 2026-01-22 00:30:10.053 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:30:11 np0005591285 nova_compute[182755]: 2026-01-22 00:30:11.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:12 np0005591285 nova_compute[182755]: 2026-01-22 00:30:12.053 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:14 np0005591285 nova_compute[182755]: 2026-01-22 00:30:14.015 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:14 np0005591285 nova_compute[182755]: 2026-01-22 00:30:14.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:16 np0005591285 nova_compute[182755]: 2026-01-22 00:30:16.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:19 np0005591285 nova_compute[182755]: 2026-01-22 00:30:19.017 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:21 np0005591285 nova_compute[182755]: 2026-01-22 00:30:21.623 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.174 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.175 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:30:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:30:24 np0005591285 nova_compute[182755]: 2026-01-22 00:30:24.019 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:24 np0005591285 podman[239179]: 2026-01-22 00:30:24.183063953 +0000 UTC m=+0.056319267 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7)
Jan 21 19:30:24 np0005591285 podman[239180]: 2026-01-22 00:30:24.192532418 +0000 UTC m=+0.060178471 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 19:30:26 np0005591285 nova_compute[182755]: 2026-01-22 00:30:26.624 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:29 np0005591285 nova_compute[182755]: 2026-01-22 00:30:29.021 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:31 np0005591285 nova_compute[182755]: 2026-01-22 00:30:31.625 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:33 np0005591285 podman[239216]: 2026-01-22 00:30:33.210320155 +0000 UTC m=+0.081039091 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:30:34 np0005591285 nova_compute[182755]: 2026-01-22 00:30:34.025 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:36 np0005591285 podman[239242]: 2026-01-22 00:30:36.17370199 +0000 UTC m=+0.049488904 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:30:36 np0005591285 podman[239243]: 2026-01-22 00:30:36.17373502 +0000 UTC m=+0.046132683 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:30:36 np0005591285 nova_compute[182755]: 2026-01-22 00:30:36.627 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:39 np0005591285 nova_compute[182755]: 2026-01-22 00:30:39.068 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:39 np0005591285 podman[239285]: 2026-01-22 00:30:39.20272075 +0000 UTC m=+0.078885734 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 21 19:30:41 np0005591285 nova_compute[182755]: 2026-01-22 00:30:41.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:44 np0005591285 nova_compute[182755]: 2026-01-22 00:30:44.082 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:30:44.115 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:30:44 np0005591285 nova_compute[182755]: 2026-01-22 00:30:44.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:30:44.116 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:30:46 np0005591285 nova_compute[182755]: 2026-01-22 00:30:46.630 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:49 np0005591285 nova_compute[182755]: 2026-01-22 00:30:49.084 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:51 np0005591285 nova_compute[182755]: 2026-01-22 00:30:51.631 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:52 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:30:52.118 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:30:54 np0005591285 nova_compute[182755]: 2026-01-22 00:30:54.086 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:55 np0005591285 podman[239314]: 2026-01-22 00:30:55.204782322 +0000 UTC m=+0.072866852 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 19:30:55 np0005591285 podman[239315]: 2026-01-22 00:30:55.222832987 +0000 UTC m=+0.073937710 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 19:30:56 np0005591285 nova_compute[182755]: 2026-01-22 00:30:56.632 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:59 np0005591285 nova_compute[182755]: 2026-01-22 00:30:59.088 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:30:59 np0005591285 nova_compute[182755]: 2026-01-22 00:30:59.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:30:59 np0005591285 nova_compute[182755]: 2026-01-22 00:30:59.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:31:00 np0005591285 nova_compute[182755]: 2026-01-22 00:31:00.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:00 np0005591285 nova_compute[182755]: 2026-01-22 00:31:00.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:31:00 np0005591285 nova_compute[182755]: 2026-01-22 00:31:00.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:31:00 np0005591285 nova_compute[182755]: 2026-01-22 00:31:00.236 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:31:01 np0005591285 nova_compute[182755]: 2026-01-22 00:31:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:01 np0005591285 nova_compute[182755]: 2026-01-22 00:31:01.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:01 np0005591285 nova_compute[182755]: 2026-01-22 00:31:01.633 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:31:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:02.996 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:02.997 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:02.997 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:31:03 np0005591285 nova_compute[182755]: 2026-01-22 00:31:03.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:04 np0005591285 nova_compute[182755]: 2026-01-22 00:31:04.137 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:31:04 np0005591285 podman[239357]: 2026-01-22 00:31:04.22673829 +0000 UTC m=+0.059382190 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:31:06 np0005591285 nova_compute[182755]: 2026-01-22 00:31:06.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:06 np0005591285 nova_compute[182755]: 2026-01-22 00:31:06.636 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:31:07 np0005591285 podman[239381]: 2026-01-22 00:31:07.18582737 +0000 UTC m=+0.054448227 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 19:31:07 np0005591285 podman[239382]: 2026-01-22 00:31:07.193541768 +0000 UTC m=+0.054718664 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:31:07 np0005591285 nova_compute[182755]: 2026-01-22 00:31:07.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:09 np0005591285 nova_compute[182755]: 2026-01-22 00:31:09.195 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:31:10 np0005591285 podman[239424]: 2026-01-22 00:31:10.203262928 +0000 UTC m=+0.080087096 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.244 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.377 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.378 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5765MB free_disk=73.17741775512695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.378 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.378 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.440 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.441 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.476 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.501 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.502 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 19:31:10 np0005591285 nova_compute[182755]: 2026-01-22 00:31:10.502 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:31:11 np0005591285 nova_compute[182755]: 2026-01-22 00:31:11.638 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:11.727 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:47:fc 10.100.0.2 2001:db8::f816:3eff:fe99:47fc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe99:47fc/64', 'neutron:device_id': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b38c45f8-f983-4d04-9b7c-db4cbbad86b5) old=Port_Binding(mac=['fa:16:3e:99:47:fc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:31:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:11.728 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b38c45f8-f983-4d04-9b7c-db4cbbad86b5 in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 updated
Jan 21 19:31:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:11.728 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 19:31:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:31:11.730 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[756fbef1-5f36-408a-97f8-514d5ea8b930]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:31:13 np0005591285 nova_compute[182755]: 2026-01-22 00:31:13.502 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:31:14 np0005591285 nova_compute[182755]: 2026-01-22 00:31:14.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:16 np0005591285 nova_compute[182755]: 2026-01-22 00:31:16.642 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:19 np0005591285 nova_compute[182755]: 2026-01-22 00:31:19.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:21 np0005591285 nova_compute[182755]: 2026-01-22 00:31:21.646 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:24 np0005591285 nova_compute[182755]: 2026-01-22 00:31:24.254 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:26 np0005591285 podman[239452]: 2026-01-22 00:31:26.184653305 +0000 UTC m=+0.055396652 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:31:26 np0005591285 podman[239451]: 2026-01-22 00:31:26.200655255 +0000 UTC m=+0.075407289 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Jan 21 19:31:26 np0005591285 nova_compute[182755]: 2026-01-22 00:31:26.649 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:29 np0005591285 nova_compute[182755]: 2026-01-22 00:31:29.305 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:31 np0005591285 nova_compute[182755]: 2026-01-22 00:31:31.655 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:34 np0005591285 nova_compute[182755]: 2026-01-22 00:31:34.362 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:35 np0005591285 podman[239492]: 2026-01-22 00:31:35.175749324 +0000 UTC m=+0.048274259 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:31:36 np0005591285 nova_compute[182755]: 2026-01-22 00:31:36.660 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:38 np0005591285 podman[239517]: 2026-01-22 00:31:38.180686787 +0000 UTC m=+0.048235130 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:31:38 np0005591285 podman[239516]: 2026-01-22 00:31:38.20460027 +0000 UTC m=+0.077206600 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:31:39 np0005591285 nova_compute[182755]: 2026-01-22 00:31:39.417 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:41 np0005591285 podman[239555]: 2026-01-22 00:31:41.233461367 +0000 UTC m=+0.105333916 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:31:41 np0005591285 nova_compute[182755]: 2026-01-22 00:31:41.663 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:44 np0005591285 nova_compute[182755]: 2026-01-22 00:31:44.466 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:46 np0005591285 nova_compute[182755]: 2026-01-22 00:31:46.709 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:49 np0005591285 nova_compute[182755]: 2026-01-22 00:31:49.468 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:51 np0005591285 nova_compute[182755]: 2026-01-22 00:31:51.779 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:54 np0005591285 nova_compute[182755]: 2026-01-22 00:31:54.503 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:56 np0005591285 nova_compute[182755]: 2026-01-22 00:31:56.782 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:31:57 np0005591285 podman[239582]: 2026-01-22 00:31:57.207327751 +0000 UTC m=+0.064844826 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:31:57 np0005591285 podman[239581]: 2026-01-22 00:31:57.225748176 +0000 UTC m=+0.079549701 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter)
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.030 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.030 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.049 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.144 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.145 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.151 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.152 182759 INFO nova.compute.claims [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.299 182759 DEBUG nova.compute.provider_tree [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.314 182759 DEBUG nova.scheduler.client.report [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
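The inventory payload above is what this host reports to the Placement service, and the headroom it offers follows the standard Placement formula: capacity = (total - reserved) * allocation_ratio. A minimal sketch (the helper name is ours, not Nova's) applying that formula to the exact values logged:

```python
def placement_capacity(total, reserved, allocation_ratio):
    """Usable capacity as Placement computes it: (total - reserved) * ratio."""
    return (total - reserved) * allocation_ratio

# Values taken verbatim from the inventory data logged above.
inventory = {
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 79, "reserved": 1, "allocation_ratio": 0.9},
}

capacities = {
    rc: placement_capacity(v["total"], v["reserved"], v["allocation_ratio"])
    for rc, v in inventory.items()
}
print(capacities)  # e.g. 8 physical VCPUs overcommitted 4x -> 32 schedulable
```

This is why the 8-core host can accept many small instances: CPU is overcommitted 4x, while memory and disk are offered at (or below, for disk) their physical size minus the reservation.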
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.334 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.335 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.386 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.387 182759 DEBUG nova.network.neutron [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.404 182759 INFO nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.424 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.552 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.553 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.553 182759 INFO nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Creating image(s)#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.554 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.554 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.555 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.569 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.628 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
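Note that `qemu-img info` above is not run directly: it is wrapped in `python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30`, which caps the child's address space at 1 GiB and its CPU time at 30 s so that a maliciously crafted image header cannot make the probe consume unbounded resources. A rough sketch of the same idea using only the standard library (`run_limited` is a hypothetical helper, not oslo's API), for Unix systems:

```python
import resource
import subprocess

def run_limited(argv, as_bytes=1 << 30, cpu_seconds=30):
    """Run a child process with RLIMIT_AS and RLIMIT_CPU caps applied,
    roughly what the oslo_concurrency.prlimit wrapper in the log does."""
    def set_limits():
        # Applied in the child between fork and exec.
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    return subprocess.run(argv, preexec_fn=set_limits, capture_output=True)

# e.g. run_limited(["qemu-img", "info", "--output=json", path])
result = run_limited(["true"])
```

oslo uses a separate wrapper process rather than `preexec_fn` because `preexec_fn` is unsafe in threaded programs; the sketch above only illustrates the resource-capping technique.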
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.630 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.630 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.646 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.709 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.710 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.750 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
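The `qemu-img create` above builds the instance disk as a copy-on-write qcow2 overlay: reads of unmodified blocks fall through to the shared base image cached under `_base`, and only the instance's own writes land in its per-instance `disk` file. A minimal sketch (hypothetical helper, not Nova's code) that assembles the same argv seen in the log:

```python
def qcow2_overlay_cmd(backing_file, target, size_bytes, backing_fmt="raw"):
    """Build the argv for creating a qcow2 COW overlay on top of a base
    image, mirroring the qemu-img invocation logged above."""
    return [
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={backing_file},backing_fmt={backing_fmt}",
        target, str(size_bytes),
    ]

cmd = qcow2_overlay_cmd(
    "/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474",
    "/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk",
    1073741824,  # 1 GiB: the requested root disk size
)
```

Passing `backing_fmt=raw` explicitly matters: qemu refuses to probe the backing file's format for security reasons, so the creator must state it.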
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.751 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.752 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.804 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.805 182759 DEBUG nova.virt.disk.api [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.806 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.828 182759 DEBUG nova.policy [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.862 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.863 182759 DEBUG nova.virt.disk.api [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
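The "Cannot resize image ... to a smaller size" message is the benign outcome of a guard, not an error: Nova will only grow a disk, and it decides by comparing the requested size against the image's current virtual size as reported by `qemu-img info --output=json`. A sketch of that check, assuming a qemu-img-info-style JSON document (the helper mirrors the log's behavior; it is not Nova's actual implementation):

```python
import json

def can_resize_image(qemu_img_info_json, requested_bytes):
    """Allow a resize only when the requested size exceeds the current
    virtual size; qcow2 images cannot safely be shrunk in place."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return requested_bytes > virtual_size

# Matches the log: the overlay was created at 1 GiB, so a 1 GiB
# "resize" request is rejected as not strictly larger.
info = json.dumps({"virtual-size": 1073741824, "format": "qcow2"})
print(can_resize_image(info, 1073741824))  # False
```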
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.863 182759 DEBUG nova.objects.instance [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 3da83711-3468-42e8-aec6-ea1b9848aa39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.876 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.876 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Ensure instance console log exists: /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.876 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.877 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:31:58 np0005591285 nova_compute[182755]: 2026-01-22 00:31:58.877 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:31:59 np0005591285 nova_compute[182755]: 2026-01-22 00:31:59.528 182759 DEBUG nova.network.neutron [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Successfully created port: 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:31:59 np0005591285 nova_compute[182755]: 2026-01-22 00:31:59.546 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.383 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.384 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.816 182759 DEBUG nova.network.neutron [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Successfully updated port: 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.945 182759 DEBUG nova.compute.manager [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-changed-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.945 182759 DEBUG nova.compute.manager [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Refreshing instance network info cache due to event network-changed-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.946 182759 DEBUG oslo_concurrency.lockutils [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.947 182759 DEBUG oslo_concurrency.lockutils [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.947 182759 DEBUG nova.network.neutron [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Refreshing network info cache for port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:32:00 np0005591285 nova_compute[182755]: 2026-01-22 00:32:00.951 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:32:01 np0005591285 nova_compute[182755]: 2026-01-22 00:32:01.121 182759 DEBUG nova.network.neutron [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:32:01 np0005591285 nova_compute[182755]: 2026-01-22 00:32:01.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:01 np0005591285 nova_compute[182755]: 2026-01-22 00:32:01.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:01 np0005591285 nova_compute[182755]: 2026-01-22 00:32:01.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:32:01 np0005591285 nova_compute[182755]: 2026-01-22 00:32:01.831 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:01 np0005591285 nova_compute[182755]: 2026-01-22 00:32:01.871 182759 DEBUG nova.network.neutron [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:32:02 np0005591285 nova_compute[182755]: 2026-01-22 00:32:02.031 182759 DEBUG oslo_concurrency.lockutils [req-82542648-976f-44bb-8590-4656d0a1e7d6 req-fb6f6440-70c2-4bd4-91f3-c705e8a8e27d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:32:02 np0005591285 nova_compute[182755]: 2026-01-22 00:32:02.032 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:32:02 np0005591285 nova_compute[182755]: 2026-01-22 00:32:02.033 182759 DEBUG nova.network.neutron [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:32:02 np0005591285 nova_compute[182755]: 2026-01-22 00:32:02.437 182759 DEBUG nova.network.neutron [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:32:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:02.998 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:02.998 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:02.998 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:03 np0005591285 nova_compute[182755]: 2026-01-22 00:32:03.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:04 np0005591285 nova_compute[182755]: 2026-01-22 00:32:04.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:05 np0005591285 nova_compute[182755]: 2026-01-22 00:32:05.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:05 np0005591285 nova_compute[182755]: 2026-01-22 00:32:05.986 182759 DEBUG nova.network.neutron [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updating instance_info_cache with network_info: [{"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.009 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.010 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Instance network_info: |[{"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
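The network_info blob cached above carries everything the virt driver needs per port; extracting, say, the instance's fixed addresses is a matter of walking ports, subnets, and IPs. A sketch over a trimmed-down copy of the logged structure (only the fields used here are kept):

```python
import json

def fixed_ips(network_info):
    """Collect every fixed IP across all ports and subnets in a Nova
    network_info cache entry like the JSON list logged above."""
    return [
        ip["address"]
        for vif in network_info
        for subnet in vif["network"]["subnets"]
        for ip in subnet["ips"]
        if ip["type"] == "fixed"
    ]

# Trimmed-down version of the cache entry from the log.
network_info = json.loads("""
[{"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8",
  "network": {"subnets": [
    {"cidr": "2001:db8::/64",
     "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed"}]},
    {"cidr": "10.100.0.0/28",
     "ips": [{"address": "10.100.0.8", "type": "fixed"}]}]}}]
""")
print(fixed_ips(network_info))  # ['2001:db8::f816:3eff:fe0d:7f01', '10.100.0.8']
```

Note `"active": false` in the full logged entry: the port exists in Neutron but is not yet bound live, which is why Nova will later wait for the network-vif-plugged event before declaring the instance ACTIVE.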
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.012 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Start _get_guest_xml network_info=[{"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.016 182759 WARNING nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.020 182759 DEBUG nova.virt.libvirt.host [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.020 182759 DEBUG nova.virt.libvirt.host [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.023 182759 DEBUG nova.virt.libvirt.host [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.024 182759 DEBUG nova.virt.libvirt.host [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.025 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.026 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.026 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.026 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.027 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.027 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.027 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.027 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.028 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.028 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.028 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.028 182759 DEBUG nova.virt.hardware [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.032 182759 DEBUG nova.virt.libvirt.vif [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:31:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-805464775',display_name='tempest-TestGettingAddress-server-805464775',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-805464775',id=170,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEN8d2ViSKyfMT2XqudbWbmZhugtSo0AUa3hssPfIHXZXJuMLED9XwzZlkaV7imX6BxsiK4pWMoh9iMrlu0xzxZRk5QI4OmaesLZRq01J/YYbtUdz/2t7KMOohgfE7jvUQ==',key_name='tempest-TestGettingAddress-1273457996',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-exz0mvk1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:31:58Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3da83711-3468-42e8-aec6-ea1b9848aa39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.033 182759 DEBUG nova.network.os_vif_util [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.034 182759 DEBUG nova.network.os_vif_util [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.035 182759 DEBUG nova.objects.instance [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3da83711-3468-42e8-aec6-ea1b9848aa39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.050 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <uuid>3da83711-3468-42e8-aec6-ea1b9848aa39</uuid>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <name>instance-000000aa</name>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestGettingAddress-server-805464775</nova:name>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:32:06</nova:creationTime>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        <nova:port uuid="2db11cf7-a9f4-4143-96b7-5bfdd381d1d8">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe0d:7f01" ipVersion="6"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <entry name="serial">3da83711-3468-42e8-aec6-ea1b9848aa39</entry>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <entry name="uuid">3da83711-3468-42e8-aec6-ea1b9848aa39</entry>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.config"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:0d:7f:01"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <target dev="tap2db11cf7-a9"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/console.log" append="off"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:32:06 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:32:06 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:32:06 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:32:06 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.052 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Preparing to wait for external event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.052 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.053 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.053 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.053 182759 DEBUG nova.virt.libvirt.vif [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:31:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-805464775',display_name='tempest-TestGettingAddress-server-805464775',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-805464775',id=170,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEN8d2ViSKyfMT2XqudbWbmZhugtSo0AUa3hssPfIHXZXJuMLED9XwzZlkaV7imX6BxsiK4pWMoh9iMrlu0xzxZRk5QI4OmaesLZRq01J/YYbtUdz/2t7KMOohgfE7jvUQ==',key_name='tempest-TestGettingAddress-1273457996',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-exz0mvk1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:31:58Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3da83711-3468-42e8-aec6-ea1b9848aa39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.054 182759 DEBUG nova.network.os_vif_util [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.054 182759 DEBUG nova.network.os_vif_util [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.055 182759 DEBUG os_vif [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.055 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.056 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.056 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.061 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2db11cf7-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.062 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2db11cf7-a9, col_values=(('external_ids', {'iface-id': '2db11cf7-a9f4-4143-96b7-5bfdd381d1d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:7f:01', 'vm-uuid': '3da83711-3468-42e8-aec6-ea1b9848aa39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:06 np0005591285 NetworkManager[55017]: <info>  [1769041926.0645] manager: (tap2db11cf7-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.064 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.074 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.075 182759 INFO os_vif [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9')#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.117 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.118 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.118 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:0d:7f:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.118 182759 INFO nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Using config drive#033[00m
Jan 21 19:32:06 np0005591285 podman[239639]: 2026-01-22 00:32:06.184258858 +0000 UTC m=+0.054145357 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.898 182759 INFO nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Creating config drive at /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.config#033[00m
Jan 21 19:32:06 np0005591285 nova_compute[182755]: 2026-01-22 00:32:06.903 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwwdsutbn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.027 182759 DEBUG oslo_concurrency.processutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwwdsutbn" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:32:07 np0005591285 kernel: tap2db11cf7-a9: entered promiscuous mode
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.0946] manager: (tap2db11cf7-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/311)
Jan 21 19:32:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:07Z|00638|binding|INFO|Claiming lport 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 for this chassis.
Jan 21 19:32:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:07Z|00639|binding|INFO|2db11cf7-a9f4-4143-96b7-5bfdd381d1d8: Claiming fa:16:3e:0d:7f:01 10.100.0.8 2001:db8::f816:3eff:fe0d:7f01
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.096 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.104 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.1096] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.108 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.1116] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.112 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:7f:01 10.100.0.8 2001:db8::f816:3eff:fe0d:7f01'], port_security=['fa:16:3e:0d:7f:01 10.100.0.8 2001:db8::f816:3eff:fe0d:7f01'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28 2001:db8::f816:3eff:fe0d:7f01/64', 'neutron:device_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d931792-0187-42bd-ad30-da2120e7bd41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.114 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 bound to our chassis#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.115 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.125 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bb51ab67-2414-460d-aa36-1fe46e93a398]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.127 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap895033ac-51 in ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.129 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap895033ac-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:32:07 np0005591285 systemd-udevd[239681]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.129 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8d6afa-2061-4eb2-985d-c4c72e7a66f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.131 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfa6f6c-5492-446e-bb7c-b6ed4726dc29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 systemd-machined[154022]: New machine qemu-74-instance-000000aa.
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.1428] device (tap2db11cf7-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.1438] device (tap2db11cf7-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.143 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[f12c7082-dc02-4eb2-87f4-4d76999cc3c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 systemd[1]: Started Virtual Machine qemu-74-instance-000000aa.
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.173 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f0936a02-7d8b-4dc8-bcf1-9b67a30be9b8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.173 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.184 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:07Z|00640|binding|INFO|Setting lport 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 ovn-installed in OVS
Jan 21 19:32:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:07Z|00641|binding|INFO|Setting lport 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 up in Southbound
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.193 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.210 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[7a41d6ba-20ce-4b75-9640-69003f75a2b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.217 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3f8d8c-218f-41ff-9a0e-a1618b3de2f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 systemd-udevd[239685]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.2192] manager: (tap895033ac-50): new Veth device (/org/freedesktop/NetworkManager/Devices/314)
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.249 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4ff57a-f0b1-4d0a-9f9a-4b2418ebadb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.253 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b17203-317a-4e9c-ac56-a4e570d3b431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.2730] device (tap895033ac-50): carrier: link connected
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.278 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[308955ce-4303-46f9-9b31-8276adf6806c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.295 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[433eeefe-85df-49df-b5a5-481363d1a4e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap895033ac-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:47:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648593, 'reachable_time': 43912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239714, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.310 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4565de38-ea4a-47ec-8763-fa2116dba82e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:47fc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 648593, 'tstamp': 648593}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239715, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.325 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5a78c7af-7766-4466-9059-eb7b87f9b0c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap895033ac-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:47:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648593, 'reachable_time': 43912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239716, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.356 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b26293c5-1660-496c-86b6-85c74fb387fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.407 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[78b5b57b-39bb-44e5-a7b0-ef7355aeff9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.409 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap895033ac-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.409 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.410 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap895033ac-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.411 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 NetworkManager[55017]: <info>  [1769041927.4122] manager: (tap895033ac-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 21 19:32:07 np0005591285 kernel: tap895033ac-50: entered promiscuous mode
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.414 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap895033ac-50, col_values=(('external_ids', {'iface-id': 'b38c45f8-f983-4d04-9b7c-db4cbbad86b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:07Z|00642|binding|INFO|Releasing lport b38c45f8-f983-4d04-9b7c-db4cbbad86b5 from this chassis (sb_readonly=0)
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.427 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.431 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[39f63cf5-980e-4dba-b300-7cf25ab0dd88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.432 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-895033ac-5f91-4350-ad1a-b5c5d0ff13a2
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.pid.haproxy
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 895033ac-5f91-4350-ad1a-b5c5d0ff13a2
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:32:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:07.433 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'env', 'PROCESS_TAG=haproxy-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/895033ac-5f91-4350-ad1a-b5c5d0ff13a2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.572 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041927.5714357, 3da83711-3468-42e8-aec6-ea1b9848aa39 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.572 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] VM Started (Lifecycle Event)#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.595 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.599 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041927.5720987, 3da83711-3468-42e8-aec6-ea1b9848aa39 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.600 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.624 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.627 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.645 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.757 182759 DEBUG nova.compute.manager [req-b57d93dc-b051-4b9b-915c-2c8a157c3ad5 req-c53e0648-a34d-4319-a9d7-9e6db3555306 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.758 182759 DEBUG oslo_concurrency.lockutils [req-b57d93dc-b051-4b9b-915c-2c8a157c3ad5 req-c53e0648-a34d-4319-a9d7-9e6db3555306 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.761 182759 DEBUG oslo_concurrency.lockutils [req-b57d93dc-b051-4b9b-915c-2c8a157c3ad5 req-c53e0648-a34d-4319-a9d7-9e6db3555306 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.763 182759 DEBUG oslo_concurrency.lockutils [req-b57d93dc-b051-4b9b-915c-2c8a157c3ad5 req-c53e0648-a34d-4319-a9d7-9e6db3555306 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.764 182759 DEBUG nova.compute.manager [req-b57d93dc-b051-4b9b-915c-2c8a157c3ad5 req-c53e0648-a34d-4319-a9d7-9e6db3555306 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Processing event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.765 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.771 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769041927.7709744, 3da83711-3468-42e8-aec6-ea1b9848aa39 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.772 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.780 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.785 182759 INFO nova.virt.libvirt.driver [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Instance spawned successfully.#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.787 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.806 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.812 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.826 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.826 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.827 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.827 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.828 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.828 182759 DEBUG nova.virt.libvirt.driver [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:32:07 np0005591285 podman[239755]: 2026-01-22 00:32:07.83205141 +0000 UTC m=+0.061897006 container create b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.839 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:32:07 np0005591285 systemd[1]: Started libpod-conmon-b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb.scope.
Jan 21 19:32:07 np0005591285 podman[239755]: 2026-01-22 00:32:07.792691201 +0000 UTC m=+0.022536797 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:32:07 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:32:07 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64395b99d1bfdda7774d27e884efb3aae6cdf0c19c312b85051752165e002252/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:32:07 np0005591285 podman[239755]: 2026-01-22 00:32:07.921771204 +0000 UTC m=+0.151616800 container init b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 19:32:07 np0005591285 podman[239755]: 2026-01-22 00:32:07.928325231 +0000 UTC m=+0.158170807 container start b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.937 182759 INFO nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Took 9.39 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:32:07 np0005591285 nova_compute[182755]: 2026-01-22 00:32:07.938 182759 DEBUG nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:32:07 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [NOTICE]   (239775) : New worker (239777) forked
Jan 21 19:32:07 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [NOTICE]   (239775) : Loading success.
Jan 21 19:32:08 np0005591285 nova_compute[182755]: 2026-01-22 00:32:08.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:08 np0005591285 nova_compute[182755]: 2026-01-22 00:32:08.621 182759 INFO nova.compute.manager [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Took 10.52 seconds to build instance.#033[00m
Jan 21 19:32:08 np0005591285 nova_compute[182755]: 2026-01-22 00:32:08.648 182759 DEBUG oslo_concurrency.lockutils [None req-7c0476d1-b41e-42ef-92a8-ea1af027411b a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:09 np0005591285 podman[239786]: 2026-01-22 00:32:09.197793662 +0000 UTC m=+0.060830328 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:32:09 np0005591285 podman[239787]: 2026-01-22 00:32:09.246742189 +0000 UTC m=+0.099746284 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.608 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.892 182759 DEBUG nova.compute.manager [req-291ac1e5-f86d-4efa-8e3e-3d24d91f7e28 req-6a94e1eb-3348-499a-97d9-14874bf75f56 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.892 182759 DEBUG oslo_concurrency.lockutils [req-291ac1e5-f86d-4efa-8e3e-3d24d91f7e28 req-6a94e1eb-3348-499a-97d9-14874bf75f56 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.892 182759 DEBUG oslo_concurrency.lockutils [req-291ac1e5-f86d-4efa-8e3e-3d24d91f7e28 req-6a94e1eb-3348-499a-97d9-14874bf75f56 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.893 182759 DEBUG oslo_concurrency.lockutils [req-291ac1e5-f86d-4efa-8e3e-3d24d91f7e28 req-6a94e1eb-3348-499a-97d9-14874bf75f56 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.893 182759 DEBUG nova.compute.manager [req-291ac1e5-f86d-4efa-8e3e-3d24d91f7e28 req-6a94e1eb-3348-499a-97d9-14874bf75f56 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] No waiting events found dispatching network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:32:09 np0005591285 nova_compute[182755]: 2026-01-22 00:32:09.893 182759 WARNING nova.compute.manager [req-291ac1e5-f86d-4efa-8e3e-3d24d91f7e28 req-6a94e1eb-3348-499a-97d9-14874bf75f56 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received unexpected event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.149 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:10.171 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:32:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:10.172 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.247 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.319 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.375 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.376 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.431 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.579 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.581 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5549MB free_disk=73.17643737792969GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.581 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.581 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.648 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 3da83711-3468-42e8-aec6-ea1b9848aa39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.649 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.649 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.705 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.733 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.758 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:32:10 np0005591285 nova_compute[182755]: 2026-01-22 00:32:10.759 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:11 np0005591285 nova_compute[182755]: 2026-01-22 00:32:11.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:11 np0005591285 nova_compute[182755]: 2026-01-22 00:32:11.141 182759 DEBUG nova.compute.manager [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-changed-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:32:11 np0005591285 nova_compute[182755]: 2026-01-22 00:32:11.142 182759 DEBUG nova.compute.manager [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Refreshing instance network info cache due to event network-changed-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:32:11 np0005591285 nova_compute[182755]: 2026-01-22 00:32:11.143 182759 DEBUG oslo_concurrency.lockutils [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:32:11 np0005591285 nova_compute[182755]: 2026-01-22 00:32:11.143 182759 DEBUG oslo_concurrency.lockutils [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:32:11 np0005591285 nova_compute[182755]: 2026-01-22 00:32:11.143 182759 DEBUG nova.network.neutron [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Refreshing network info cache for port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:32:12 np0005591285 podman[239830]: 2026-01-22 00:32:12.221651893 +0000 UTC m=+0.093492427 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack 
Kubernetes Operator team, config_id=ovn_controller)
Jan 21 19:32:13 np0005591285 nova_compute[182755]: 2026-01-22 00:32:13.761 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:14.175 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:14 np0005591285 nova_compute[182755]: 2026-01-22 00:32:14.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:14 np0005591285 nova_compute[182755]: 2026-01-22 00:32:14.937 182759 DEBUG nova.network.neutron [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updated VIF entry in instance network info cache for port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:32:14 np0005591285 nova_compute[182755]: 2026-01-22 00:32:14.939 182759 DEBUG nova.network.neutron [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updating instance_info_cache with network_info: [{"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:32:15 np0005591285 nova_compute[182755]: 2026-01-22 00:32:15.318 182759 DEBUG oslo_concurrency.lockutils [req-4e07bdb0-3780-4650-9e9c-db19e3d7ff0f req-e0ec26af-95f2-4a72-b8b9-410fa457b38c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:32:16 np0005591285 nova_compute[182755]: 2026-01-22 00:32:16.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:18 np0005591285 nova_compute[182755]: 2026-01-22 00:32:18.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:19 np0005591285 nova_compute[182755]: 2026-01-22 00:32:19.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:21 np0005591285 nova_compute[182755]: 2026-01-22 00:32:21.077 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:21Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:7f:01 10.100.0.8
Jan 21 19:32:21 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:21Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:7f:01 10.100.0.8
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.178 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'name': 'tempest-TestGettingAddress-server-805464775', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000aa', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.210 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.212 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '003c226f-8079-41b7-aeff-d7381795afc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.180063', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1af1be6-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '83d79dc4ea00f7c8af83275dad4fb9b51d7d578096b1dbf0934dc03526c5b3f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.180063', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1af3126-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': 'a0a9d23a2099dbd0f81080f72bbb1b75a5ba3559264fc6009c4f95b60050a20c'}]}, 'timestamp': '2026-01-22 00:32:23.212734', '_unique_id': 'a2a2d1f97fa34ee69a3f1720b0c480f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.215 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.217 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.217 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>]
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.217 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.217 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>]
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.218 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.218 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>]
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.221 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3da83711-3468-42e8-aec6-ea1b9848aa39 / tap2db11cf7-a9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.221 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '206ca1eb-8243-426c-b996-d83d7c092b6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.218539', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b0ab8c-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': '4720273d73d826deba53e649c34d5546667e51b78ae4ce42e45916fd67d276ff'}]}, 'timestamp': '2026-01-22 00:32:23.222300', '_unique_id': 'b39c367886ba4e74bd6a2446a2aea343'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.222 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '693c8d22-b99d-4b6b-bf5d-e8e81affcfd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.224083', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b0fd44-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': 'ba76abfe3a56c121e4c6d4ebd90c47f9ee6a65463921cbeb18f7af5da02e1807'}]}, 'timestamp': '2026-01-22 00:32:23.224366', '_unique_id': '905a8619815b4e27b4d195e3ef667b2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.224 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.226 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd6c32c7-05f4-4fee-8923-ea17a2f3672e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.226122', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b14f60-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': 'b08f1bb6590f531e00df821f7d0c5e36dd1c60efc071063044f51486992d8864'}]}, 'timestamp': '2026-01-22 00:32:23.226510', '_unique_id': '725b24296c3b48c1b69106c6bf58ddd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.227 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.228 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b757dbcd-c32b-4391-a674-2aa59bee18aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.228320', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b1a2bc-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': '9451e3df62db0ecacc1bf07e600bf47ff0358f6bafdb9a8314de5bf65c8b9780'}]}, 'timestamp': '2026-01-22 00:32:23.228600', '_unique_id': '6aa1b13937da4a36b625009270d10b32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.229 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.230 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.read.latency volume: 182900787 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.230 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.read.latency volume: 31757097 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39b1f1a1-988d-4ddd-9543-a21695c8a62e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 182900787, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.230442', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1b1f78a-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '2557bdfcb6cb8a1298b2b6f548c46bba930c7c87288cc033c2cd8457e954e050'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31757097, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.230442', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1b2059a-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '3bdbfd57dfd100d7b2e205cc5c0c44bae892e7a9f4dcc64b468b4b43c07260df'}]}, 'timestamp': '2026-01-22 00:32:23.231167', '_unique_id': '6a4f6def9ef64fbfaa834834636c823b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.231 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.233 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.233 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-805464775>]
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.233 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2948f29-092d-41cc-8170-02b825c3362e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.233788', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b27ade-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': '27386002f6adf13f1cf4445b109541a690ac5e811aba1f251b2539bdc4140290'}]}, 'timestamp': '2026-01-22 00:32:23.234183', '_unique_id': '95cc862d38db43ff9f736bff28222721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.234 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.235 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.252 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '062ed12e-220e-48eb-ba94-8f394cdbbd8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'timestamp': '2026-01-22T00:32:23.235831', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd1b56956-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.971771435, 'message_signature': '962f7acb1e4e4c2d706c833371ad56eb241bb7e6bf054bd67d7196accaafb7f8'}]}, 'timestamp': '2026-01-22 00:32:23.253464', '_unique_id': 'd60344e241b342dba172e7d9690ca51b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.254 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.256 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.incoming.bytes volume: 1766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cd8a482-2dc0-4f68-9a00-0b839f91c8f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1766, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.256457', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b5f10a-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': 'ed31033421baaef2fa45dbd9eb6c3c51cf44c952452bc218a1ec8b704532eb87'}]}, 'timestamp': '2026-01-22 00:32:23.256917', '_unique_id': '216417f2a085437ab36e025188bce351'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.258 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3c4c48c-1cc8-4a1b-a820-5263536115ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.258632', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b641dc-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': 'd713f01e577d4b2f711e81660d35631c24d79baf61552fedd4327f0f4b125924'}]}, 'timestamp': '2026-01-22 00:32:23.258927', '_unique_id': 'd05751f3ac1a477f8ce1961aa89c5be5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.270 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.271 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd53a12b0-1637-4af0-88b5-39b9b3311f95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.260524', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1b81868-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.97974148, 'message_signature': 'ac8762a0a1a1c51faa731f1bfd602b1eb2da270ba1e0e5c1b8d09396719150af'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.260524', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1b828f8-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.97974148, 'message_signature': 'ec41f26ac180a04bf9896664bbe4b88c18b025c3024e241e16e6bac78f664f69'}]}, 'timestamp': '2026-01-22 00:32:23.271407', '_unique_id': '384ab33063c943029eb7c740d5817b6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.272 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.273 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.outgoing.bytes volume: 1636 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e841877a-5831-4c37-8e55-c54e4765b297', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1636, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.273517', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b888a2-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': 'be1c000e6ead7af6ee4a612572d251cbb148e3458c86af8dbfc76e30efba057b'}]}, 'timestamp': '2026-01-22 00:32:23.273826', '_unique_id': '8bbbd3a172bf40588103b193d98ea700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.274 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.275 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.275 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/cpu volume: 11950000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fc87957-fb36-4779-a1aa-4390c9acb329', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11950000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'timestamp': '2026-01-22T00:32:23.275383', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd1b8dbae-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.971771435, 'message_signature': '5c44a677936b7f1dfba518626cd52457c3834d5a7be7a75d2725d1ce94b9262b'}]}, 'timestamp': '2026-01-22 00:32:23.275961', '_unique_id': 'a7dff36f05344f3884af1026b4dba625'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.277 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.277 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8158f8ab-95c1-4ef7-8d62-700bf71c957a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.277526', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1b9250a-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': 'dab848a0fa6a28ee590b4c968e909af87fed2a4773560bbb01abf23dc0e43826'}]}, 'timestamp': '2026-01-22 00:32:23.277823', '_unique_id': 'a8fb85485ce9424db1cca4a5d5c752be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.279 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.write.latency volume: 2442497440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.279 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd29755e5-cf9b-44b5-afdd-77f0674ca64b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2442497440, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.279524', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1b9764a-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '54938510c9f2687ffe2f9cd847069573a47bc3dd3c04ced6e7f509f5c7ba3c90'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.279524', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1b984c8-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '349ba5015d22a75e3b4600a4d80616828f4e3e1ecfacff1f769b6216286a7624'}]}, 'timestamp': '2026-01-22 00:32:23.280272', '_unique_id': '6e62ed58c77843619bc5caa3b43b70d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.280 12 ERROR oslo_messaging.notify.messaging 
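Note: every traceback above bottoms out in the same `ConnectionRefusedError: [Errno 111]` from `amqp/transport.py`, which kombu then re-raises as `kombu.exceptions.OperationalError`; i.e. the agent's plain TCP connect to the RabbitMQ broker is being refused (nothing listening on the broker port), not an AMQP-level auth or vhost failure. A minimal, hypothetical Python sketch of that errno classification (names and messages are ours, not from ceilometer or kombu):

```python
import errno

def classify_transport_error(exc: OSError) -> str:
    """Map the low-level OSError seen in the log to a likely broker state.

    kombu wraps these OSErrors into kombu.exceptions.OperationalError;
    this sketch only mirrors the errno interpretation, it does not
    reproduce kombu's retry logic.
    """
    if exc.errno == errno.ECONNREFUSED:   # errno 111 on Linux
        return "broker port closed (service down or not listening)"
    if exc.errno == errno.ETIMEDOUT:
        return "broker unreachable (routing/firewall drop)"
    return "other transport error"

# Simulate the failure recorded above without a live broker:
refused = ConnectionRefusedError(errno.ECONNREFUSED, "Connection refused")
print(classify_transport_error(refused))
```

Under this reading, the usual next checks would be whether the rabbitmq service/container is running on the broker host and whether the `transport_url` in the agent's oslo.messaging config points at the right host/port.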
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.282 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.283 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5c012eb-67e4-4f6a-b726-c3b7e78040a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.282602', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1b9f1e2-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '90209281eadf601d5aba484a7414f8acdeedbeb355b2eeec1f7e175288b3740f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.282602', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1ba0132-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': 'df52482940727fec2c705ecae257607d92204556a26cf4998ba8ecc0a3c00c25'}]}, 'timestamp': '2026-01-22 00:32:23.283485', '_unique_id': '255383accd16400e9cae5328803a867a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.284 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.285 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.286 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.286 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29a05cfa-e865-4d42-8083-d42595ac526e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.286081', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1ba74a0-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '18391fee4e20840e8880c2f26eaef37f0f2a0853f683b74159036669df05ab87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.286081', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1ba8030-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '33fbb730895761a8087dad9827b9c145379cb3ca82be80d8405afa85affb51f4'}]}, 'timestamp': '2026-01-22 00:32:23.286687', '_unique_id': '8fb21e74342e4829b92d88a24aee3f74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.288 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.read.bytes volume: 30398976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.289 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d12b17b-576c-4f75-bf37-47f8bad5360c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30398976, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.288759', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1bade54-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '85d8c12a58b112b12b0080a6d08d149887a509c1192ba3b6d688cf83fa2704b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.288759', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1baea70-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.899286275, 'message_signature': '84e39579772b071580b54f37572e44918855abe38ece418641ab888cee90885f'}]}, 'timestamp': '2026-01-22 00:32:23.289439', '_unique_id': 'daf598269b204171a272c95d8e6f69a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.291 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd105371c-af1d-4f54-911b-271e410b52aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000aa-3da83711-3468-42e8-aec6-ea1b9848aa39-tap2db11cf7-a9', 'timestamp': '2026-01-22T00:32:23.291112', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'tap2db11cf7-a9', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:7f:01', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2db11cf7-a9'}, 'message_id': 'd1bb3930-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.93774568, 'message_signature': '5e04cbaf15ba27aa27cd86483c36eeed57c28623ddb37fdb0e371de9faf7c997'}]}, 'timestamp': '2026-01-22 00:32:23.291466', '_unique_id': '58324c40db334ab8b1b45a8d44f6baad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.292 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.293 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.293 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.293 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cd18892-3978-4049-a499-eabee93d50cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.293278', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1bb8dd6-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.97974148, 'message_signature': '4a7ebc7e30000df77ce98127f0323761341cf3778a68fc989c257def6ef5a967'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.293278', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1bb9ab0-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.97974148, 'message_signature': '5712db3c17b956c40c6ceee5026998ef8674dbf303858a4ecd921cad2bc2f3a0'}]}, 'timestamp': '2026-01-22 00:32:23.293975', '_unique_id': 'abe7530c2fab44f2972e1e57f22c1044'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.295 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.296 12 DEBUG ceilometer.compute.pollsters [-] 3da83711-3468-42e8-aec6-ea1b9848aa39/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b875018-72b5-402f-8f1d-eeb0cb756a98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-vda', 'timestamp': '2026-01-22T00:32:23.295899', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd1bbf460-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.97974148, 'message_signature': '6782a35783b5a193bb755e9293b03270d9d7f9e835f48d6fd14051610c07396f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '3da83711-3468-42e8-aec6-ea1b9848aa39-sda', 'timestamp': '2026-01-22T00:32:23.295899', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-805464775', 'name': 'instance-000000aa', 'instance_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd1bc009a-f729-11f0-b13b-fa163e425b77', 'monotonic_time': 6501.97974148, 'message_signature': '00cfabec8eacc91e021654dc6026a02a19d67a572e8ec18fdb852a8e700d16f2'}]}, 'timestamp': '2026-01-22 00:32:23.296562', '_unique_id': 'eb9c65ed6244442ab4aad84c5259486f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:32:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:32:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:32:24 np0005591285 nova_compute[182755]: 2026-01-22 00:32:24.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:26 np0005591285 nova_compute[182755]: 2026-01-22 00:32:26.079 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:28 np0005591285 podman[239872]: 2026-01-22 00:32:28.1889592 +0000 UTC m=+0.054851967 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal)
Jan 21 19:32:28 np0005591285 podman[239873]: 2026-01-22 00:32:28.193447721 +0000 UTC m=+0.052502434 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:32:29 np0005591285 nova_compute[182755]: 2026-01-22 00:32:29.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:31 np0005591285 nova_compute[182755]: 2026-01-22 00:32:31.082 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:34 np0005591285 nova_compute[182755]: 2026-01-22 00:32:34.666 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:36 np0005591285 nova_compute[182755]: 2026-01-22 00:32:36.085 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:37 np0005591285 podman[239912]: 2026-01-22 00:32:37.172590018 +0000 UTC m=+0.047570711 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:32:39 np0005591285 nova_compute[182755]: 2026-01-22 00:32:39.668 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:40 np0005591285 podman[239938]: 2026-01-22 00:32:40.177000326 +0000 UTC m=+0.050372756 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 21 19:32:40 np0005591285 podman[239939]: 2026-01-22 00:32:40.17680404 +0000 UTC m=+0.045197477 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:32:40 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:40Z|00643|memory_trim|INFO|Detected inactivity (last active 30022 ms ago): trimming memory
Jan 21 19:32:41 np0005591285 nova_compute[182755]: 2026-01-22 00:32:41.088 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:43 np0005591285 podman[239980]: 2026-01-22 00:32:43.218073591 +0000 UTC m=+0.088956025 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:32:44 np0005591285 nova_compute[182755]: 2026-01-22 00:32:44.672 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:45 np0005591285 nova_compute[182755]: 2026-01-22 00:32:45.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:32:45 np0005591285 nova_compute[182755]: 2026-01-22 00:32:45.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:32:45 np0005591285 nova_compute[182755]: 2026-01-22 00:32:45.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.049 182759 DEBUG nova.compute.manager [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-changed-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.049 182759 DEBUG nova.compute.manager [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Refreshing instance network info cache due to event network-changed-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.050 182759 DEBUG oslo_concurrency.lockutils [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.050 182759 DEBUG oslo_concurrency.lockutils [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.050 182759 DEBUG nova.network.neutron [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Refreshing network info cache for port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.090 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.172 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.172 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.173 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.173 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.173 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.188 182759 INFO nova.compute.manager [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Terminating instance#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.200 182759 DEBUG nova.compute.manager [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:32:46 np0005591285 kernel: tap2db11cf7-a9 (unregistering): left promiscuous mode
Jan 21 19:32:46 np0005591285 NetworkManager[55017]: <info>  [1769041966.2234] device (tap2db11cf7-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.232 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:46Z|00644|binding|INFO|Releasing lport 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 from this chassis (sb_readonly=0)
Jan 21 19:32:46 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:46Z|00645|binding|INFO|Setting lport 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 down in Southbound
Jan 21 19:32:46 np0005591285 ovn_controller[94908]: 2026-01-22T00:32:46Z|00646|binding|INFO|Removing iface tap2db11cf7-a9 ovn-installed in OVS
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.235 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.239 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:7f:01 10.100.0.8 2001:db8::f816:3eff:fe0d:7f01'], port_security=['fa:16:3e:0d:7f:01 10.100.0.8 2001:db8::f816:3eff:fe0d:7f01'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28 2001:db8::f816:3eff:fe0d:7f01/64', 'neutron:device_id': '3da83711-3468-42e8-aec6-ea1b9848aa39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d931792-0187-42bd-ad30-da2120e7bd41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.240 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 unbound from our chassis#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.241 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.243 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[17c382c6-49e6-4461-a9b0-1b08d31cbc53]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.244 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 namespace which is not needed anymore#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.259 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Jan 21 19:32:46 np0005591285 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d000000aa.scope: Consumed 14.226s CPU time.
Jan 21 19:32:46 np0005591285 systemd-machined[154022]: Machine qemu-74-instance-000000aa terminated.
Jan 21 19:32:46 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [NOTICE]   (239775) : haproxy version is 2.8.14-c23fe91
Jan 21 19:32:46 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [NOTICE]   (239775) : path to executable is /usr/sbin/haproxy
Jan 21 19:32:46 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [WARNING]  (239775) : Exiting Master process...
Jan 21 19:32:46 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [WARNING]  (239775) : Exiting Master process...
Jan 21 19:32:46 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [ALERT]    (239775) : Current worker (239777) exited with code 143 (Terminated)
Jan 21 19:32:46 np0005591285 neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2[239771]: [WARNING]  (239775) : All workers exited. Exiting... (0)
Jan 21 19:32:46 np0005591285 systemd[1]: libpod-b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb.scope: Deactivated successfully.
Jan 21 19:32:46 np0005591285 podman[240031]: 2026-01-22 00:32:46.40723184 +0000 UTC m=+0.066805258 container died b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.464 182759 INFO nova.virt.libvirt.driver [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Instance destroyed successfully.#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.465 182759 DEBUG nova.objects.instance [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 3da83711-3468-42e8-aec6-ea1b9848aa39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:32:46 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb-userdata-shm.mount: Deactivated successfully.
Jan 21 19:32:46 np0005591285 systemd[1]: var-lib-containers-storage-overlay-64395b99d1bfdda7774d27e884efb3aae6cdf0c19c312b85051752165e002252-merged.mount: Deactivated successfully.
Jan 21 19:32:46 np0005591285 podman[240031]: 2026-01-22 00:32:46.532308396 +0000 UTC m=+0.191881804 container cleanup b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.531 182759 DEBUG nova.virt.libvirt.vif [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:31:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-805464775',display_name='tempest-TestGettingAddress-server-805464775',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-805464775',id=170,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEN8d2ViSKyfMT2XqudbWbmZhugtSo0AUa3hssPfIHXZXJuMLED9XwzZlkaV7imX6BxsiK4pWMoh9iMrlu0xzxZRk5QI4OmaesLZRq01J/YYbtUdz/2t7KMOohgfE7jvUQ==',key_name='tempest-TestGettingAddress-1273457996',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:32:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-exz0mvk1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:32:07Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3da83711-3468-42e8-aec6-ea1b9848aa39,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.532 182759 DEBUG nova.network.os_vif_util [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.533 182759 DEBUG nova.network.os_vif_util [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.533 182759 DEBUG os_vif [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.536 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.536 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2db11cf7-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.538 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.540 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 systemd[1]: libpod-conmon-b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb.scope: Deactivated successfully.
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.546 182759 INFO os_vif [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:7f:01,bridge_name='br-int',has_traffic_filtering=True,id=2db11cf7-a9f4-4143-96b7-5bfdd381d1d8,network=Network(895033ac-5f91-4350-ad1a-b5c5d0ff13a2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db11cf7-a9')#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.548 182759 INFO nova.virt.libvirt.driver [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Deleting instance files /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39_del#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.548 182759 INFO nova.virt.libvirt.driver [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Deletion of /var/lib/nova/instances/3da83711-3468-42e8-aec6-ea1b9848aa39_del complete#033[00m
Jan 21 19:32:46 np0005591285 podman[240079]: 2026-01-22 00:32:46.713248596 +0000 UTC m=+0.158118107 container remove b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.719 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[99b54f3d-6363-4e20-b525-dc0c2ae41983]: (4, ('Thu Jan 22 12:32:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 (b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb)\nb421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb\nThu Jan 22 12:32:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 (b421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb)\nb421caa6912080c2b9fc94414af3cd7393d8ba78997a0886434ae2b8f84ac6bb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.720 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[98d995f4-9713-4742-acdd-21b22976621c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.721 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap895033ac-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:32:46 np0005591285 kernel: tap895033ac-50: left promiscuous mode
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.734 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.737 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[36373d76-93c4-4d5f-af8e-71949c16bc38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.758 182759 INFO nova.compute.manager [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Took 0.56 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.758 182759 DEBUG oslo.service.loopingcall [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.758 182759 DEBUG nova.compute.manager [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.758 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2a62e36e-e477-476a-b14a-2e7ce66b2d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:46 np0005591285 nova_compute[182755]: 2026-01-22 00:32:46.758 182759 DEBUG nova.network.neutron [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.759 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[779b7f8b-8821-4e7c-9e53-1110090bb8f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.772 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ea68d27a-ba5e-4ff2-982c-2b201775e365]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648586, 'reachable_time': 37505, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240094, 'error': None, 'target': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.775 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 19:32:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:46.775 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[51436f52-6d49-44b2-8db3-c2b919ef0f22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:32:46 np0005591285 systemd[1]: run-netns-ovnmeta\x2d895033ac\x2d5f91\x2d4350\x2dad1a\x2db5c5d0ff13a2.mount: Deactivated successfully.
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.034 182759 DEBUG nova.compute.manager [req-4b1d3718-dba1-40b8-a941-2c0b05003088 req-4154a35a-fa40-494f-98fc-bd37ba2e73e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-vif-unplugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.034 182759 DEBUG oslo_concurrency.lockutils [req-4b1d3718-dba1-40b8-a941-2c0b05003088 req-4154a35a-fa40-494f-98fc-bd37ba2e73e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.034 182759 DEBUG oslo_concurrency.lockutils [req-4b1d3718-dba1-40b8-a941-2c0b05003088 req-4154a35a-fa40-494f-98fc-bd37ba2e73e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.035 182759 DEBUG oslo_concurrency.lockutils [req-4b1d3718-dba1-40b8-a941-2c0b05003088 req-4154a35a-fa40-494f-98fc-bd37ba2e73e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.035 182759 DEBUG nova.compute.manager [req-4b1d3718-dba1-40b8-a941-2c0b05003088 req-4154a35a-fa40-494f-98fc-bd37ba2e73e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] No waiting events found dispatching network-vif-unplugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.035 182759 DEBUG nova.compute.manager [req-4b1d3718-dba1-40b8-a941-2c0b05003088 req-4154a35a-fa40-494f-98fc-bd37ba2e73e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-vif-unplugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 19:32:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:47.232 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.232 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:47.233 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 19:32:47 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:32:47.234 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.865 182759 DEBUG nova.network.neutron [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.887 182759 INFO nova.compute.manager [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Took 1.13 seconds to deallocate network for instance.
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.981 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:32:47 np0005591285 nova_compute[182755]: 2026-01-22 00:32:47.981 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.044 182759 DEBUG nova.compute.provider_tree [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.061 182759 DEBUG nova.scheduler.client.report [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.080 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.110 182759 INFO nova.scheduler.client.report [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 3da83711-3468-42e8-aec6-ea1b9848aa39
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.128 182759 DEBUG nova.compute.manager [req-13325704-fbbd-4811-b29d-61f4a7894afa req-63abea7e-ca65-41d5-a5c6-07fdc8c020b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-vif-deleted-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.190 182759 DEBUG oslo_concurrency.lockutils [None req-11570629-f978-4d4b-b6a7-b788e6459d08 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.835 182759 DEBUG nova.network.neutron [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updated VIF entry in instance network info cache for port 2db11cf7-a9f4-4143-96b7-5bfdd381d1d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.835 182759 DEBUG nova.network.neutron [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Updating instance_info_cache with network_info: [{"id": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "address": "fa:16:3e:0d:7f:01", "network": {"id": "895033ac-5f91-4350-ad1a-b5c5d0ff13a2", "bridge": "br-int", "label": "tempest-network-smoke--1056014394", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0d:7f01", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db11cf7-a9", "ovs_interfaceid": "2db11cf7-a9f4-4143-96b7-5bfdd381d1d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:32:48 np0005591285 nova_compute[182755]: 2026-01-22 00:32:48.856 182759 DEBUG oslo_concurrency.lockutils [req-3220d107-27a5-40ef-a9b1-2b2562646b28 req-4b0099bc-8fef-4946-94ec-7f9e8285df47 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3da83711-3468-42e8-aec6-ea1b9848aa39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.131 182759 DEBUG nova.compute.manager [req-ec426d96-fa0a-4337-b7de-6e3ee0a94734 req-117008df-891c-47a8-bd5e-1cd71493ac1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.131 182759 DEBUG oslo_concurrency.lockutils [req-ec426d96-fa0a-4337-b7de-6e3ee0a94734 req-117008df-891c-47a8-bd5e-1cd71493ac1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.133 182759 DEBUG oslo_concurrency.lockutils [req-ec426d96-fa0a-4337-b7de-6e3ee0a94734 req-117008df-891c-47a8-bd5e-1cd71493ac1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.133 182759 DEBUG oslo_concurrency.lockutils [req-ec426d96-fa0a-4337-b7de-6e3ee0a94734 req-117008df-891c-47a8-bd5e-1cd71493ac1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3da83711-3468-42e8-aec6-ea1b9848aa39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.134 182759 DEBUG nova.compute.manager [req-ec426d96-fa0a-4337-b7de-6e3ee0a94734 req-117008df-891c-47a8-bd5e-1cd71493ac1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] No waiting events found dispatching network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:32:49 np0005591285 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.134 182759 WARNING nova.compute.manager [req-ec426d96-fa0a-4337-b7de-6e3ee0a94734 req-117008df-891c-47a8-bd5e-1cd71493ac1a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Received unexpected event network-vif-plugged-2db11cf7-a9f4-4143-96b7-5bfdd381d1d8 for instance with vm_state deleted and task_state None.
Jan 21 19:32:49 np0005591285 nova_compute[182755]: 2026-01-22 00:32:49.674 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:51 np0005591285 nova_compute[182755]: 2026-01-22 00:32:51.541 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:54 np0005591285 nova_compute[182755]: 2026-01-22 00:32:54.677 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:55 np0005591285 nova_compute[182755]: 2026-01-22 00:32:55.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:55 np0005591285 nova_compute[182755]: 2026-01-22 00:32:55.750 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:56 np0005591285 nova_compute[182755]: 2026-01-22 00:32:56.545 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:32:59 np0005591285 podman[240099]: 2026-01-22 00:32:59.189825837 +0000 UTC m=+0.059254686 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:32:59 np0005591285 podman[240098]: 2026-01-22 00:32:59.19478641 +0000 UTC m=+0.066296874 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Jan 21 19:32:59 np0005591285 nova_compute[182755]: 2026-01-22 00:32:59.678 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:00 np0005591285 nova_compute[182755]: 2026-01-22 00:33:00.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:00 np0005591285 nova_compute[182755]: 2026-01-22 00:33:00.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 19:33:00 np0005591285 nova_compute[182755]: 2026-01-22 00:33:00.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 19:33:00 np0005591285 nova_compute[182755]: 2026-01-22 00:33:00.262 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 19:33:01 np0005591285 nova_compute[182755]: 2026-01-22 00:33:01.463 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041966.4610627, 3da83711-3468-42e8-aec6-ea1b9848aa39 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 19:33:01 np0005591285 nova_compute[182755]: 2026-01-22 00:33:01.464 182759 INFO nova.compute.manager [-] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] VM Stopped (Lifecycle Event)
Jan 21 19:33:01 np0005591285 nova_compute[182755]: 2026-01-22 00:33:01.487 182759 DEBUG nova.compute.manager [None req-651d5abe-33bb-4dee-90bc-6fb33da8a7d5 - - - - - -] [instance: 3da83711-3468-42e8-aec6-ea1b9848aa39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 19:33:01 np0005591285 nova_compute[182755]: 2026-01-22 00:33:01.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:33:02.999 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:33:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:33:03.000 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:33:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:33:03.000 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:33:03 np0005591285 nova_compute[182755]: 2026-01-22 00:33:03.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:03 np0005591285 nova_compute[182755]: 2026-01-22 00:33:03.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:03 np0005591285 nova_compute[182755]: 2026-01-22 00:33:03.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:03 np0005591285 nova_compute[182755]: 2026-01-22 00:33:03.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 19:33:04 np0005591285 nova_compute[182755]: 2026-01-22 00:33:04.680 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:06 np0005591285 nova_compute[182755]: 2026-01-22 00:33:06.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:06 np0005591285 nova_compute[182755]: 2026-01-22 00:33:06.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:08 np0005591285 podman[240140]: 2026-01-22 00:33:08.195735224 +0000 UTC m=+0.069104910 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:33:08 np0005591285 nova_compute[182755]: 2026-01-22 00:33:08.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:08 np0005591285 nova_compute[182755]: 2026-01-22 00:33:08.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:09 np0005591285 nova_compute[182755]: 2026-01-22 00:33:09.680 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:11 np0005591285 podman[240163]: 2026-01-22 00:33:11.171705208 +0000 UTC m=+0.048217729 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 21 19:33:11 np0005591285 podman[240164]: 2026-01-22 00:33:11.171749399 +0000 UTC m=+0.048197178 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:33:11 np0005591285 nova_compute[182755]: 2026-01-22 00:33:11.555 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.280 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.280 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.281 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.281 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.457 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.458 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5754MB free_disk=73.1773796081543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.458 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.458 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.520 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.520 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.546 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.563 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.593 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 19:33:12 np0005591285 nova_compute[182755]: 2026-01-22 00:33:12.594 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:33:13 np0005591285 nova_compute[182755]: 2026-01-22 00:33:13.594 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:14 np0005591285 podman[240205]: 2026-01-22 00:33:14.229476481 +0000 UTC m=+0.104744189 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 19:33:14 np0005591285 nova_compute[182755]: 2026-01-22 00:33:14.682 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:16 np0005591285 nova_compute[182755]: 2026-01-22 00:33:16.557 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:19 np0005591285 nova_compute[182755]: 2026-01-22 00:33:19.685 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:21 np0005591285 nova_compute[182755]: 2026-01-22 00:33:21.560 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:24 np0005591285 nova_compute[182755]: 2026-01-22 00:33:24.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:26 np0005591285 nova_compute[182755]: 2026-01-22 00:33:26.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:29 np0005591285 nova_compute[182755]: 2026-01-22 00:33:29.688 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:30 np0005591285 podman[240232]: 2026-01-22 00:33:30.173579714 +0000 UTC m=+0.048195608 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 19:33:30 np0005591285 podman[240233]: 2026-01-22 00:33:30.181453525 +0000 UTC m=+0.051313691 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:33:31 np0005591285 nova_compute[182755]: 2026-01-22 00:33:31.567 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:34 np0005591285 nova_compute[182755]: 2026-01-22 00:33:34.717 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:36 np0005591285 nova_compute[182755]: 2026-01-22 00:33:36.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:38 np0005591285 nova_compute[182755]: 2026-01-22 00:33:38.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:38 np0005591285 nova_compute[182755]: 2026-01-22 00:33:38.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 19:33:39 np0005591285 podman[240274]: 2026-01-22 00:33:39.175863243 +0000 UTC m=+0.053037948 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:33:39 np0005591285 nova_compute[182755]: 2026-01-22 00:33:39.719 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:41 np0005591285 nova_compute[182755]: 2026-01-22 00:33:41.574 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:42 np0005591285 ovn_controller[94908]: 2026-01-22T00:33:42Z|00647|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 21 19:33:42 np0005591285 podman[240298]: 2026-01-22 00:33:42.210155936 +0000 UTC m=+0.077353892 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 19:33:42 np0005591285 podman[240299]: 2026-01-22 00:33:42.212424137 +0000 UTC m=+0.076797007 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:33:43 np0005591285 nova_compute[182755]: 2026-01-22 00:33:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:33:44 np0005591285 nova_compute[182755]: 2026-01-22 00:33:44.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:33:45 np0005591285 podman[240341]: 2026-01-22 00:33:45.206329332 +0000 UTC m=+0.080171199 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 19:33:46 np0005591285 nova_compute[182755]: 2026-01-22 00:33:46.578 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:33:49 np0005591285 nova_compute[182755]: 2026-01-22 00:33:49.723 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:33:51 np0005591285 nova_compute[182755]: 2026-01-22 00:33:51.582 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:33:54 np0005591285 nova_compute[182755]: 2026-01-22 00:33:54.725 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:33:56 np0005591285 nova_compute[182755]: 2026-01-22 00:33:56.585 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:33:59 np0005591285 nova_compute[182755]: 2026-01-22 00:33:59.727 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:01 np0005591285 podman[240367]: 2026-01-22 00:34:01.1758943 +0000 UTC m=+0.050452968 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 19:34:01 np0005591285 podman[240368]: 2026-01-22 00:34:01.182461638 +0000 UTC m=+0.052566636 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:34:01 np0005591285 nova_compute[182755]: 2026-01-22 00:34:01.588 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:02 np0005591285 nova_compute[182755]: 2026-01-22 00:34:02.242 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:02 np0005591285 nova_compute[182755]: 2026-01-22 00:34:02.242 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 19:34:02 np0005591285 nova_compute[182755]: 2026-01-22 00:34:02.242 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 19:34:02 np0005591285 nova_compute[182755]: 2026-01-22 00:34:02.257 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 19:34:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:34:03.000 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:34:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:34:03.001 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:34:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:34:03.001 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:34:03 np0005591285 nova_compute[182755]: 2026-01-22 00:34:03.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:03 np0005591285 nova_compute[182755]: 2026-01-22 00:34:03.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 19:34:04 np0005591285 nova_compute[182755]: 2026-01-22 00:34:04.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:04 np0005591285 nova_compute[182755]: 2026-01-22 00:34:04.729 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:05 np0005591285 nova_compute[182755]: 2026-01-22 00:34:05.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:06 np0005591285 nova_compute[182755]: 2026-01-22 00:34:06.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:06 np0005591285 nova_compute[182755]: 2026-01-22 00:34:06.592 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:08 np0005591285 nova_compute[182755]: 2026-01-22 00:34:08.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:09 np0005591285 nova_compute[182755]: 2026-01-22 00:34:09.731 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:10 np0005591285 podman[240408]: 2026-01-22 00:34:10.181282215 +0000 UTC m=+0.058089424 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:34:10 np0005591285 nova_compute[182755]: 2026-01-22 00:34:10.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:11 np0005591285 nova_compute[182755]: 2026-01-22 00:34:11.644 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:34:11.963 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:34:11 np0005591285 nova_compute[182755]: 2026-01-22 00:34:11.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:11 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:34:11.964 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 19:34:13 np0005591285 podman[240432]: 2026-01-22 00:34:13.178982133 +0000 UTC m=+0.056429170 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:34:13 np0005591285 podman[240433]: 2026-01-22 00:34:13.191648364 +0000 UTC m=+0.063121210 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.247 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.445 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.446 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5747MB free_disk=73.1773796081543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.446 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.447 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.584 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.584 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.621 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.640 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.642 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.642 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:34:14 np0005591285 nova_compute[182755]: 2026-01-22 00:34:14.785 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:16 np0005591285 podman[240476]: 2026-01-22 00:34:16.255258395 +0000 UTC m=+0.127670216 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:34:16 np0005591285 nova_compute[182755]: 2026-01-22 00:34:16.646 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:19 np0005591285 nova_compute[182755]: 2026-01-22 00:34:19.787 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:34:20.966 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:34:21 np0005591285 nova_compute[182755]: 2026-01-22 00:34:21.637 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:34:21 np0005591285 nova_compute[182755]: 2026-01-22 00:34:21.648 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:34:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:34:24 np0005591285 nova_compute[182755]: 2026-01-22 00:34:24.789 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:26 np0005591285 nova_compute[182755]: 2026-01-22 00:34:26.651 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:29 np0005591285 nova_compute[182755]: 2026-01-22 00:34:29.791 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:31 np0005591285 nova_compute[182755]: 2026-01-22 00:34:31.654 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:32 np0005591285 podman[240503]: 2026-01-22 00:34:32.195474583 +0000 UTC m=+0.060360785 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:34:32 np0005591285 podman[240502]: 2026-01-22 00:34:32.21058775 +0000 UTC m=+0.076368307 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6)
Jan 21 19:34:34 np0005591285 nova_compute[182755]: 2026-01-22 00:34:34.792 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:36 np0005591285 nova_compute[182755]: 2026-01-22 00:34:36.658 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:39 np0005591285 nova_compute[182755]: 2026-01-22 00:34:39.794 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:41 np0005591285 podman[240544]: 2026-01-22 00:34:41.206641272 +0000 UTC m=+0.082335016 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:34:41 np0005591285 nova_compute[182755]: 2026-01-22 00:34:41.663 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:44 np0005591285 podman[240569]: 2026-01-22 00:34:44.173780928 +0000 UTC m=+0.045430714 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:34:44 np0005591285 podman[240568]: 2026-01-22 00:34:44.198736249 +0000 UTC m=+0.074464634 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 19:34:44 np0005591285 nova_compute[182755]: 2026-01-22 00:34:44.796 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:46 np0005591285 nova_compute[182755]: 2026-01-22 00:34:46.667 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:47 np0005591285 podman[240610]: 2026-01-22 00:34:47.240266467 +0000 UTC m=+0.107181365 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 19:34:49 np0005591285 nova_compute[182755]: 2026-01-22 00:34:49.799 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:51 np0005591285 nova_compute[182755]: 2026-01-22 00:34:51.672 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:54 np0005591285 nova_compute[182755]: 2026-01-22 00:34:54.801 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:56 np0005591285 nova_compute[182755]: 2026-01-22 00:34:56.678 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:34:59 np0005591285 nova_compute[182755]: 2026-01-22 00:34:59.803 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:01 np0005591285 nova_compute[182755]: 2026-01-22 00:35:01.683 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:03.002 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:35:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:03.002 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:35:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:03.002 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:35:03 np0005591285 podman[240638]: 2026-01-22 00:35:03.184937247 +0000 UTC m=+0.057425057 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Jan 21 19:35:03 np0005591285 podman[240639]: 2026-01-22 00:35:03.190140786 +0000 UTC m=+0.058199366 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 21 19:35:03 np0005591285 nova_compute[182755]: 2026-01-22 00:35:03.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:03 np0005591285 nova_compute[182755]: 2026-01-22 00:35:03.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:35:03 np0005591285 nova_compute[182755]: 2026-01-22 00:35:03.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:35:03 np0005591285 nova_compute[182755]: 2026-01-22 00:35:03.235 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:35:04 np0005591285 nova_compute[182755]: 2026-01-22 00:35:04.805 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:05 np0005591285 nova_compute[182755]: 2026-01-22 00:35:05.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:05 np0005591285 nova_compute[182755]: 2026-01-22 00:35:05.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:35:06 np0005591285 nova_compute[182755]: 2026-01-22 00:35:06.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:06 np0005591285 nova_compute[182755]: 2026-01-22 00:35:06.688 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:07 np0005591285 nova_compute[182755]: 2026-01-22 00:35:07.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:07 np0005591285 nova_compute[182755]: 2026-01-22 00:35:07.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:09 np0005591285 nova_compute[182755]: 2026-01-22 00:35:09.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:10 np0005591285 nova_compute[182755]: 2026-01-22 00:35:10.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:11 np0005591285 nova_compute[182755]: 2026-01-22 00:35:11.692 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:12 np0005591285 podman[240678]: 2026-01-22 00:35:12.17028682 +0000 UTC m=+0.047574671 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:35:12 np0005591285 nova_compute[182755]: 2026-01-22 00:35:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:12.967 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:35:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:12.967 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:35:12 np0005591285 nova_compute[182755]: 2026-01-22 00:35:12.969 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:14 np0005591285 nova_compute[182755]: 2026-01-22 00:35:14.808 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:15 np0005591285 podman[240704]: 2026-01-22 00:35:15.18330674 +0000 UTC m=+0.053692927 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 19:35:15 np0005591285 podman[240705]: 2026-01-22 00:35:15.19969723 +0000 UTC m=+0.070674253 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.241 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.376 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.377 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5738MB free_disk=73.1773796081543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.377 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.378 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.812 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:35:15 np0005591285 nova_compute[182755]: 2026-01-22 00:35:15.813 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.153 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.216 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.217 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.232 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.255 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.290 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.311 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.313 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.313 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:35:16 np0005591285 nova_compute[182755]: 2026-01-22 00:35:16.696 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:18 np0005591285 podman[240752]: 2026-01-22 00:35:18.206724999 +0000 UTC m=+0.079911332 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 19:35:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:18.969 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:35:19 np0005591285 nova_compute[182755]: 2026-01-22 00:35:19.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:21 np0005591285 nova_compute[182755]: 2026-01-22 00:35:21.700 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:24 np0005591285 nova_compute[182755]: 2026-01-22 00:35:24.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:26 np0005591285 nova_compute[182755]: 2026-01-22 00:35:26.704 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:29 np0005591285 nova_compute[182755]: 2026-01-22 00:35:29.813 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:30.181 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:32:e3 2001:db8:0:1:f816:3eff:fe17:32e3 2001:db8::f816:3eff:fe17:32e3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe17:32e3/64 2001:db8::f816:3eff:fe17:32e3/64', 'neutron:device_id': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=44d0292d-d743-4a92-8996-3ae3a26c0afc) old=Port_Binding(mac=['fa:16:3e:17:32:e3 2001:db8::f816:3eff:fe17:32e3'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe17:32e3/64', 'neutron:device_id': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:35:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:30.183 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 44d0292d-d743-4a92-8996-3ae3a26c0afc in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 updated#033[00m
Jan 21 19:35:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:30.184 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:35:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:35:30.185 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[52ff8bf7-1db5-47d9-b5ec-5b2dfb352a46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:35:31 np0005591285 nova_compute[182755]: 2026-01-22 00:35:31.708 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:34 np0005591285 podman[240779]: 2026-01-22 00:35:34.174423669 +0000 UTC m=+0.048489507 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 19:35:34 np0005591285 podman[240778]: 2026-01-22 00:35:34.199891914 +0000 UTC m=+0.077763374 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 21 19:35:34 np0005591285 nova_compute[182755]: 2026-01-22 00:35:34.816 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:36 np0005591285 nova_compute[182755]: 2026-01-22 00:35:36.712 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:39 np0005591285 nova_compute[182755]: 2026-01-22 00:35:39.819 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:41 np0005591285 nova_compute[182755]: 2026-01-22 00:35:41.716 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:43 np0005591285 podman[240816]: 2026-01-22 00:35:43.180896292 +0000 UTC m=+0.053127140 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:35:44 np0005591285 nova_compute[182755]: 2026-01-22 00:35:44.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:46 np0005591285 podman[240840]: 2026-01-22 00:35:46.168575899 +0000 UTC m=+0.042502715 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 19:35:46 np0005591285 podman[240841]: 2026-01-22 00:35:46.179559625 +0000 UTC m=+0.048704942 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:35:46 np0005591285 nova_compute[182755]: 2026-01-22 00:35:46.718 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:49 np0005591285 podman[240884]: 2026-01-22 00:35:49.248079758 +0000 UTC m=+0.117871303 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:35:49 np0005591285 nova_compute[182755]: 2026-01-22 00:35:49.829 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:51 np0005591285 nova_compute[182755]: 2026-01-22 00:35:51.721 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:54 np0005591285 nova_compute[182755]: 2026-01-22 00:35:54.830 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:56 np0005591285 nova_compute[182755]: 2026-01-22 00:35:56.724 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:35:59 np0005591285 nova_compute[182755]: 2026-01-22 00:35:59.831 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:01 np0005591285 nova_compute[182755]: 2026-01-22 00:36:01.728 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:36:03.003 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:36:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:36:03.003 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:36:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:36:03.003 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:36:04 np0005591285 nova_compute[182755]: 2026-01-22 00:36:04.314 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:04 np0005591285 nova_compute[182755]: 2026-01-22 00:36:04.315 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:36:04 np0005591285 nova_compute[182755]: 2026-01-22 00:36:04.315 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:36:04 np0005591285 nova_compute[182755]: 2026-01-22 00:36:04.833 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:05 np0005591285 podman[240911]: 2026-01-22 00:36:05.171949137 +0000 UTC m=+0.050496220 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter)
Jan 21 19:36:05 np0005591285 podman[240912]: 2026-01-22 00:36:05.19400625 +0000 UTC m=+0.068600747 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:36:06 np0005591285 nova_compute[182755]: 2026-01-22 00:36:06.735 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:06 np0005591285 nova_compute[182755]: 2026-01-22 00:36:06.817 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:36:06 np0005591285 nova_compute[182755]: 2026-01-22 00:36:06.818 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:06 np0005591285 nova_compute[182755]: 2026-01-22 00:36:06.818 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:36:07 np0005591285 nova_compute[182755]: 2026-01-22 00:36:07.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:08 np0005591285 nova_compute[182755]: 2026-01-22 00:36:08.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:08 np0005591285 nova_compute[182755]: 2026-01-22 00:36:08.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:09 np0005591285 nova_compute[182755]: 2026-01-22 00:36:09.835 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:10 np0005591285 nova_compute[182755]: 2026-01-22 00:36:10.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:11 np0005591285 nova_compute[182755]: 2026-01-22 00:36:11.738 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:14 np0005591285 podman[240954]: 2026-01-22 00:36:14.185761147 +0000 UTC m=+0.055572885 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:36:14 np0005591285 nova_compute[182755]: 2026-01-22 00:36:14.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:14 np0005591285 nova_compute[182755]: 2026-01-22 00:36:14.847 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.257 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.258 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.258 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.258 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.426 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.427 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5739MB free_disk=73.17740631103516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.427 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.427 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.504 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.504 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.529 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.550 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.553 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:36:15 np0005591285 nova_compute[182755]: 2026-01-22 00:36:15.553 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:36:16 np0005591285 nova_compute[182755]: 2026-01-22 00:36:16.741 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:17 np0005591285 podman[240979]: 2026-01-22 00:36:17.195702945 +0000 UTC m=+0.053080239 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:36:17 np0005591285 podman[240978]: 2026-01-22 00:36:17.202652262 +0000 UTC m=+0.075255766 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:36:19 np0005591285 nova_compute[182755]: 2026-01-22 00:36:19.850 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:20 np0005591285 podman[241021]: 2026-01-22 00:36:20.209193658 +0000 UTC m=+0.082592064 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 19:36:20 np0005591285 nova_compute[182755]: 2026-01-22 00:36:20.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:36:20.385 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:36:20 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:36:20.387 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:36:21 np0005591285 nova_compute[182755]: 2026-01-22 00:36:21.744 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:36:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:36:24 np0005591285 nova_compute[182755]: 2026-01-22 00:36:24.549 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:36:24 np0005591285 nova_compute[182755]: 2026-01-22 00:36:24.869 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:26 np0005591285 nova_compute[182755]: 2026-01-22 00:36:26.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:36:29.389 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:36:29 np0005591285 nova_compute[182755]: 2026-01-22 00:36:29.906 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:31 np0005591285 nova_compute[182755]: 2026-01-22 00:36:31.752 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:34 np0005591285 nova_compute[182755]: 2026-01-22 00:36:34.907 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:36 np0005591285 podman[241048]: 2026-01-22 00:36:36.18818376 +0000 UTC m=+0.061003413 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:36:36 np0005591285 podman[241047]: 2026-01-22 00:36:36.206900992 +0000 UTC m=+0.081186095 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 19:36:36 np0005591285 nova_compute[182755]: 2026-01-22 00:36:36.755 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:39 np0005591285 nova_compute[182755]: 2026-01-22 00:36:39.908 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:41 np0005591285 nova_compute[182755]: 2026-01-22 00:36:41.759 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:44 np0005591285 nova_compute[182755]: 2026-01-22 00:36:44.910 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:45 np0005591285 podman[241090]: 2026-01-22 00:36:45.177602143 +0000 UTC m=+0.049000190 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:36:46 np0005591285 nova_compute[182755]: 2026-01-22 00:36:46.762 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:48 np0005591285 podman[241114]: 2026-01-22 00:36:48.173645896 +0000 UTC m=+0.049516024 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:36:48 np0005591285 podman[241115]: 2026-01-22 00:36:48.176645407 +0000 UTC m=+0.048516327 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:36:49 np0005591285 nova_compute[182755]: 2026-01-22 00:36:49.912 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:51 np0005591285 podman[241152]: 2026-01-22 00:36:51.230917616 +0000 UTC m=+0.104253767 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 19:36:51 np0005591285 nova_compute[182755]: 2026-01-22 00:36:51.765 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:54 np0005591285 nova_compute[182755]: 2026-01-22 00:36:54.913 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:56 np0005591285 nova_compute[182755]: 2026-01-22 00:36:56.769 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:36:59 np0005591285 nova_compute[182755]: 2026-01-22 00:36:59.915 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:01 np0005591285 nova_compute[182755]: 2026-01-22 00:37:01.771 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:03.004 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:03.004 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:03.004 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:03.288 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:37:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:03.289 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:37:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:03.289 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:03 np0005591285 nova_compute[182755]: 2026-01-22 00:37:03.328 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:04 np0005591285 nova_compute[182755]: 2026-01-22 00:37:04.915 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:05 np0005591285 nova_compute[182755]: 2026-01-22 00:37:05.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:05 np0005591285 nova_compute[182755]: 2026-01-22 00:37:05.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:37:05 np0005591285 nova_compute[182755]: 2026-01-22 00:37:05.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:37:05 np0005591285 nova_compute[182755]: 2026-01-22 00:37:05.235 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:37:06 np0005591285 nova_compute[182755]: 2026-01-22 00:37:06.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:06 np0005591285 nova_compute[182755]: 2026-01-22 00:37:06.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:37:06 np0005591285 nova_compute[182755]: 2026-01-22 00:37:06.775 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:07 np0005591285 podman[241178]: 2026-01-22 00:37:07.186715424 +0000 UTC m=+0.059428040 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 21 19:37:07 np0005591285 podman[241179]: 2026-01-22 00:37:07.20480662 +0000 UTC m=+0.065713529 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:37:08 np0005591285 nova_compute[182755]: 2026-01-22 00:37:08.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:08 np0005591285 nova_compute[182755]: 2026-01-22 00:37:08.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:09 np0005591285 nova_compute[182755]: 2026-01-22 00:37:09.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:09 np0005591285 nova_compute[182755]: 2026-01-22 00:37:09.917 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:10 np0005591285 nova_compute[182755]: 2026-01-22 00:37:10.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:11 np0005591285 nova_compute[182755]: 2026-01-22 00:37:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:11 np0005591285 nova_compute[182755]: 2026-01-22 00:37:11.792 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:14 np0005591285 nova_compute[182755]: 2026-01-22 00:37:14.918 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:16 np0005591285 podman[241219]: 2026-01-22 00:37:16.189777164 +0000 UTC m=+0.052331299 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.444 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.445 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5761MB free_disk=73.17743301391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.446 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.446 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.515 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.515 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.567 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.589 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.591 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.591 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:16 np0005591285 nova_compute[182755]: 2026-01-22 00:37:16.795 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:17 np0005591285 nova_compute[182755]: 2026-01-22 00:37:17.591 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:19 np0005591285 podman[241244]: 2026-01-22 00:37:19.230730475 +0000 UTC m=+0.085688436 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:37:19 np0005591285 podman[241243]: 2026-01-22 00:37:19.253807867 +0000 UTC m=+0.119184769 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:37:19 np0005591285 nova_compute[182755]: 2026-01-22 00:37:19.920 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:21 np0005591285 nova_compute[182755]: 2026-01-22 00:37:21.800 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:22 np0005591285 podman[241285]: 2026-01-22 00:37:22.251771662 +0000 UTC m=+0.119050455 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 19:37:24 np0005591285 nova_compute[182755]: 2026-01-22 00:37:24.922 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:26 np0005591285 nova_compute[182755]: 2026-01-22 00:37:26.801 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:29 np0005591285 nova_compute[182755]: 2026-01-22 00:37:29.923 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:31 np0005591285 nova_compute[182755]: 2026-01-22 00:37:31.805 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:34 np0005591285 nova_compute[182755]: 2026-01-22 00:37:34.925 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:36 np0005591285 nova_compute[182755]: 2026-01-22 00:37:36.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:38 np0005591285 podman[241316]: 2026-01-22 00:37:38.182727372 +0000 UTC m=+0.055918726 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:37:38 np0005591285 podman[241315]: 2026-01-22 00:37:38.192750731 +0000 UTC m=+0.068083953 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, 
build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 21 19:37:39 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:39.727 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ff:76 2001:db8:0:1:f816:3eff:fe5f:ff76 2001:db8::f816:3eff:fe5f:ff76'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5f:ff76/64 2001:db8::f816:3eff:fe5f:ff76/64', 'neutron:device_id': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=63ad2747-135a-46c8-90ca-ec1def31a1c2) old=Port_Binding(mac=['fa:16:3e:5f:ff:76 2001:db8::f816:3eff:fe5f:ff76'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5f:ff76/64', 'neutron:device_id': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:37:39 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:39.728 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 63ad2747-135a-46c8-90ca-ec1def31a1c2 in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 updated#033[00m
Jan 21 19:37:39 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:39.729 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:37:39 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:39.731 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9bddc9c8-da8b-4766-aa23-5cd0fb34ffcb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:39 np0005591285 nova_compute[182755]: 2026-01-22 00:37:39.927 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:41 np0005591285 nova_compute[182755]: 2026-01-22 00:37:41.812 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:44 np0005591285 nova_compute[182755]: 2026-01-22 00:37:44.928 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.366 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.366 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.385 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.482 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.483 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.491 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.491 182759 INFO nova.compute.claims [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.620 182759 DEBUG nova.compute.provider_tree [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.635 182759 DEBUG nova.scheduler.client.report [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.655 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.656 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.713 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.713 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.731 182759 INFO nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.757 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:37:46 np0005591285 nova_compute[182755]: 2026-01-22 00:37:46.816 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.090 182759 DEBUG nova.policy [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.104 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.106 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.107 182759 INFO nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Creating image(s)#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.108 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.109 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.110 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.140 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:37:47 np0005591285 podman[241356]: 2026-01-22 00:37:47.170161132 +0000 UTC m=+0.047107799 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.211 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.213 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.213 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.226 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.285 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.286 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.317 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.318 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.319 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.368 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.369 182759 DEBUG nova.virt.disk.api [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.370 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.425 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.426 182759 DEBUG nova.virt.disk.api [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.426 182759 DEBUG nova.objects.instance [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid f5280c26-3c89-472c-96cd-5d580ed702ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.444 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.444 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Ensure instance console log exists: /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.445 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.445 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.445 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:47 np0005591285 nova_compute[182755]: 2026-01-22 00:37:47.962 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Successfully created port: 6158c039-5f87-4d75-91cd-734e6337b27f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:37:49 np0005591285 nova_compute[182755]: 2026-01-22 00:37:49.930 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:50 np0005591285 podman[241395]: 2026-01-22 00:37:50.215949913 +0000 UTC m=+0.078387800 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 19:37:50 np0005591285 podman[241396]: 2026-01-22 00:37:50.22435996 +0000 UTC m=+0.081061723 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:37:51 np0005591285 nova_compute[182755]: 2026-01-22 00:37:51.135 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Successfully created port: dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:37:51 np0005591285 nova_compute[182755]: 2026-01-22 00:37:51.819 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.132 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Successfully updated port: 6158c039-5f87-4d75-91cd-734e6337b27f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:37:53 np0005591285 podman[241436]: 2026-01-22 00:37:53.222497619 +0000 UTC m=+0.080868668 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.255 182759 DEBUG nova.compute.manager [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-changed-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.255 182759 DEBUG nova.compute.manager [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing instance network info cache due to event network-changed-6158c039-5f87-4d75-91cd-734e6337b27f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.255 182759 DEBUG oslo_concurrency.lockutils [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.255 182759 DEBUG oslo_concurrency.lockutils [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.255 182759 DEBUG nova.network.neutron [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing network info cache for port 6158c039-5f87-4d75-91cd-734e6337b27f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.478 182759 DEBUG nova.network.neutron [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.813 182759 DEBUG nova.network.neutron [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:37:53 np0005591285 nova_compute[182755]: 2026-01-22 00:37:53.829 182759 DEBUG oslo_concurrency.lockutils [req-dd1f072e-4207-45ee-9f2c-2d5935dab433 req-3efb54f4-daea-471c-86ec-b18744f5702e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:37:54 np0005591285 nova_compute[182755]: 2026-01-22 00:37:54.005 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Successfully updated port: dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:37:54 np0005591285 nova_compute[182755]: 2026-01-22 00:37:54.022 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:37:54 np0005591285 nova_compute[182755]: 2026-01-22 00:37:54.022 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:37:54 np0005591285 nova_compute[182755]: 2026-01-22 00:37:54.022 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:37:54 np0005591285 nova_compute[182755]: 2026-01-22 00:37:54.202 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:37:54 np0005591285 nova_compute[182755]: 2026-01-22 00:37:54.932 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:55 np0005591285 nova_compute[182755]: 2026-01-22 00:37:55.353 182759 DEBUG nova.compute.manager [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-changed-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:37:55 np0005591285 nova_compute[182755]: 2026-01-22 00:37:55.354 182759 DEBUG nova.compute.manager [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing instance network info cache due to event network-changed-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:37:55 np0005591285 nova_compute[182755]: 2026-01-22 00:37:55.354 182759 DEBUG oslo_concurrency.lockutils [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.537 182759 DEBUG nova.network.neutron [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": 
"2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.565 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.565 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Instance network_info: |[{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.566 182759 DEBUG oslo_concurrency.lockutils [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.566 182759 DEBUG nova.network.neutron [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing network info cache for port dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.570 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Start _get_guest_xml network_info=[{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.573 182759 WARNING nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.580 182759 DEBUG nova.virt.libvirt.host [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.581 182759 DEBUG nova.virt.libvirt.host [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.584 182759 DEBUG nova.virt.libvirt.host [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.584 182759 DEBUG nova.virt.libvirt.host [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.586 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.586 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.586 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.587 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.587 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.587 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.587 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.588 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.588 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.588 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.588 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.588 182759 DEBUG nova.virt.hardware [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.593 182759 DEBUG nova.virt.libvirt.vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:37:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-960448291',display_name='tempest-TestGettingAddress-server-960448291',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-960448291',id=175,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-d7q9t02g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:37:47Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=f5280c26-3c89-472c-96cd-5d580ed702ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.593 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.594 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.595 182759 DEBUG nova.virt.libvirt.vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:37:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-960448291',display_name='tempest-TestGettingAddress-server-960448291',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-960448291',id=175,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-d7q9t02g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:37:47Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=f5280c26-3c89-472c-96cd-5d580ed702ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.595 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.595 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.596 182759 DEBUG nova.objects.instance [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid f5280c26-3c89-472c-96cd-5d580ed702ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.617 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <uuid>f5280c26-3c89-472c-96cd-5d580ed702ce</uuid>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <name>instance-000000af</name>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestGettingAddress-server-960448291</nova:name>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:37:56</nova:creationTime>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:port uuid="6158c039-5f87-4d75-91cd-734e6337b27f">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        <nova:port uuid="dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe20:7f7e" ipVersion="6"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe20:7f7e" ipVersion="6"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <entry name="serial">f5280c26-3c89-472c-96cd-5d580ed702ce</entry>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <entry name="uuid">f5280c26-3c89-472c-96cd-5d580ed702ce</entry>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.config"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:27:e9:26"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <target dev="tap6158c039-5f"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:20:7f:7e"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <target dev="tapdc8f6b9c-58"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/console.log" append="off"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:37:56 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:37:56 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:37:56 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:37:56 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.618 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Preparing to wait for external event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.618 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.619 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.619 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.619 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Preparing to wait for external event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.619 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.620 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.620 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.621 182759 DEBUG nova.virt.libvirt.vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:37:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-960448291',display_name='tempest-TestGettingAddress-server-960448291',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-960448291',id=175,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-d7q9t02g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:37:47Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=f5280c26-3c89-472c-96cd-5d580ed702ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.621 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.622 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.622 182759 DEBUG os_vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.623 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.623 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.626 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.626 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6158c039-5f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.627 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6158c039-5f, col_values=(('external_ids', {'iface-id': '6158c039-5f87-4d75-91cd-734e6337b27f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:e9:26', 'vm-uuid': 'f5280c26-3c89-472c-96cd-5d580ed702ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:56 np0005591285 NetworkManager[55017]: <info>  [1769042276.6301] manager: (tap6158c039-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.632 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.636 182759 INFO os_vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f')#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.637 182759 DEBUG nova.virt.libvirt.vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:37:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-960448291',display_name='tempest-TestGettingAddress-server-960448291',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-960448291',id=175,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-d7q9t02g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:37:47Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=f5280c26-3c89-472c-96cd-5d580ed702ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.637 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.638 182759 DEBUG nova.network.os_vif_util [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.638 182759 DEBUG os_vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.638 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.639 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.639 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.641 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.641 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc8f6b9c-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.642 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc8f6b9c-58, col_values=(('external_ids', {'iface-id': 'dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:7f:7e', 'vm-uuid': 'f5280c26-3c89-472c-96cd-5d580ed702ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:56 np0005591285 NetworkManager[55017]: <info>  [1769042276.6436] manager: (tapdc8f6b9c-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.642 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.645 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.648 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.649 182759 INFO os_vif [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58')#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.700 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.701 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.701 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:27:e9:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.701 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:20:7f:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:37:56 np0005591285 nova_compute[182755]: 2026-01-22 00:37:56.702 182759 INFO nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Using config drive#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.232 182759 INFO nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Creating config drive at /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.config#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.242 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgn2szdrp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.369 182759 DEBUG oslo_concurrency.processutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgn2szdrp" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.4583] manager: (tap6158c039-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Jan 21 19:37:57 np0005591285 kernel: tap6158c039-5f: entered promiscuous mode
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00648|binding|INFO|Claiming lport 6158c039-5f87-4d75-91cd-734e6337b27f for this chassis.
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00649|binding|INFO|6158c039-5f87-4d75-91cd-734e6337b27f: Claiming fa:16:3e:27:e9:26 10.100.0.13
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.486 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.4924] manager: (tapdc8f6b9c-58): new Tun device (/org/freedesktop/NetworkManager/Devices/319)
Jan 21 19:37:57 np0005591285 kernel: tapdc8f6b9c-58: entered promiscuous mode
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.495 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00650|if_status|INFO|Not updating pb chassis for dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa now as sb is readonly
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00651|binding|INFO|Claiming lport dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa for this chassis.
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00652|binding|INFO|dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa: Claiming fa:16:3e:20:7f:7e 2001:db8:0:1:f816:3eff:fe20:7f7e 2001:db8::f816:3eff:fe20:7f7e
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.510 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:e9:26 10.100.0.13'], port_security=['fa:16:3e:27:e9:26 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96576974-adfc-492e-9141-63dd99e1cb25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=700861ed-e604-4e52-bc1a-65ca23f63d88, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=6158c039-5f87-4d75-91cd-734e6337b27f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.511 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 6158c039-5f87-4d75-91cd-734e6337b27f in datapath 96576974-adfc-492e-9141-63dd99e1cb25 bound to our chassis#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.512 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 96576974-adfc-492e-9141-63dd99e1cb25#033[00m
Jan 21 19:37:57 np0005591285 systemd-udevd[241487]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:37:57 np0005591285 systemd-udevd[241488]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.516 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:7f:7e 2001:db8:0:1:f816:3eff:fe20:7f7e 2001:db8::f816:3eff:fe20:7f7e'], port_security=['fa:16:3e:20:7f:7e 2001:db8:0:1:f816:3eff:fe20:7f7e 2001:db8::f816:3eff:fe20:7f7e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe20:7f7e/64 2001:db8::f816:3eff:fe20:7f7e/64', 'neutron:device_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.523 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a46b3a0c-e893-4d70-a63d-9c402257a1dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.523 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap96576974-a1 in ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.526 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap96576974-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.526 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3d250207-50e4-4187-9cae-7dab58b5ccb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.527 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[02cbc9f6-30e6-4a2c-9e36-b560cda7fd72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.5283] device (tap6158c039-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.5288] device (tapdc8f6b9c-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.5293] device (tap6158c039-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.5297] device (tapdc8f6b9c-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.538 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[3583f3fe-5859-4208-9d72-61c2f4a310f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 systemd-machined[154022]: New machine qemu-75-instance-000000af.
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.564 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c0e0ea-5107-44d7-b3c3-7d2442606573]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00653|binding|INFO|Setting lport 6158c039-5f87-4d75-91cd-734e6337b27f ovn-installed in OVS
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00654|binding|INFO|Setting lport 6158c039-5f87-4d75-91cd-734e6337b27f up in Southbound
Jan 21 19:37:57 np0005591285 systemd[1]: Started Virtual Machine qemu-75-instance-000000af.
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.572 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00655|binding|INFO|Setting lport dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa ovn-installed in OVS
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00656|binding|INFO|Setting lport dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa up in Southbound
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.596 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[fc59faa8-7e30-431d-8fa6-e23fb9ca9d99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.6027] manager: (tap96576974-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/320)
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.601 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c08ed387-4f69-41af-82ba-8a9ad50ca5a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.631 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c926b81d-f604-4aff-a941-47e03ff95d1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.634 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[6bdaca1d-f3d1-4d09-84d0-10c4e1373975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.6544] device (tap96576974-a0): carrier: link connected
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.659 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bdeccda1-c3a6-4140-8077-5c496f468a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.678 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f369b5b3-9081-405f-9ecb-9dd0cabce284]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap96576974-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:af:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683631, 'reachable_time': 25591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241524, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.694 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc19b61-7990-45a5-bc2e-a77c2b161505]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:afe7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 683631, 'tstamp': 683631}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241525, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.710 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed7df04-88d6-4153-b120-f4ac866e98f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap96576974-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:af:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683631, 'reachable_time': 25591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241526, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.737 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b5625e7c-6610-4734-9bf3-ef707661b1ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.804 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9c5f5d-fc89-49c0-aaf3-354c8fb55059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.806 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96576974-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.806 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.807 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96576974-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 NetworkManager[55017]: <info>  [1769042277.8099] manager: (tap96576974-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Jan 21 19:37:57 np0005591285 kernel: tap96576974-a0: entered promiscuous mode
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.812 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap96576974-a0, col_values=(('external_ids', {'iface-id': '4faa3c3e-65cf-4db1-ab38-d3f17011be65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:57Z|00657|binding|INFO|Releasing lport 4faa3c3e-65cf-4db1-ab38-d3f17011be65 from this chassis (sb_readonly=0)
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.825 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.826 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/96576974-adfc-492e-9141-63dd99e1cb25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/96576974-adfc-492e-9141-63dd99e1cb25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.828 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a65f472b-057d-4baf-9db9-fc90cb6912ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.828 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-96576974-adfc-492e-9141-63dd99e1cb25
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/96576974-adfc-492e-9141-63dd99e1cb25.pid.haproxy
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 96576974-adfc-492e-9141-63dd99e1cb25
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:37:57 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:57.831 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'env', 'PROCESS_TAG=haproxy-96576974-adfc-492e-9141-63dd99e1cb25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/96576974-adfc-492e-9141-63dd99e1cb25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.847 182759 DEBUG nova.compute.manager [req-3a81b172-4e3e-416e-99c9-97117f415431 req-2bec9933-b147-4402-aca9-079f6474a563 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.847 182759 DEBUG oslo_concurrency.lockutils [req-3a81b172-4e3e-416e-99c9-97117f415431 req-2bec9933-b147-4402-aca9-079f6474a563 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.848 182759 DEBUG oslo_concurrency.lockutils [req-3a81b172-4e3e-416e-99c9-97117f415431 req-2bec9933-b147-4402-aca9-079f6474a563 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.848 182759 DEBUG oslo_concurrency.lockutils [req-3a81b172-4e3e-416e-99c9-97117f415431 req-2bec9933-b147-4402-aca9-079f6474a563 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.848 182759 DEBUG nova.compute.manager [req-3a81b172-4e3e-416e-99c9-97117f415431 req-2bec9933-b147-4402-aca9-079f6474a563 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Processing event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.899 182759 DEBUG nova.compute.manager [req-e1028a6d-1d6f-495b-99e4-90f7018989dd req-398c838e-1837-41a8-92a4-41480395110e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.899 182759 DEBUG oslo_concurrency.lockutils [req-e1028a6d-1d6f-495b-99e4-90f7018989dd req-398c838e-1837-41a8-92a4-41480395110e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.900 182759 DEBUG oslo_concurrency.lockutils [req-e1028a6d-1d6f-495b-99e4-90f7018989dd req-398c838e-1837-41a8-92a4-41480395110e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.900 182759 DEBUG oslo_concurrency.lockutils [req-e1028a6d-1d6f-495b-99e4-90f7018989dd req-398c838e-1837-41a8-92a4-41480395110e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:57 np0005591285 nova_compute[182755]: 2026-01-22 00:37:57.900 182759 DEBUG nova.compute.manager [req-e1028a6d-1d6f-495b-99e4-90f7018989dd req-398c838e-1837-41a8-92a4-41480395110e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Processing event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.045 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.046 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042278.0464175, f5280c26-3c89-472c-96cd-5d580ed702ce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.047 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] VM Started (Lifecycle Event)#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.050 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.054 182759 INFO nova.virt.libvirt.driver [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Instance spawned successfully.#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.054 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.079 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.082 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.083 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.084 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.084 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.085 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.085 182759 DEBUG nova.virt.libvirt.driver [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.089 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.120 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.120 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042278.04716, f5280c26-3c89-472c-96cd-5d580ed702ce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.120 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.157 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.159 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042278.0490353, f5280c26-3c89-472c-96cd-5d580ed702ce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.160 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.180 182759 INFO nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Took 11.08 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.181 182759 DEBUG nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.184 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.189 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:37:58 np0005591285 podman[241563]: 2026-01-22 00:37:58.199089799 +0000 UTC m=+0.051694252 container create 03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.228 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:37:58 np0005591285 systemd[1]: Started libpod-conmon-03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a.scope.
Jan 21 19:37:58 np0005591285 podman[241563]: 2026-01-22 00:37:58.171207819 +0000 UTC m=+0.023812322 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:37:58 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:37:58 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa389c99bda41224dd64502566e3fce2fc1709898b21a1b1ce303c204a0ce220/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.297 182759 INFO nova.compute.manager [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Took 11.86 seconds to build instance.#033[00m
Jan 21 19:37:58 np0005591285 podman[241563]: 2026-01-22 00:37:58.302747019 +0000 UTC m=+0.155351502 container init 03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:37:58 np0005591285 podman[241563]: 2026-01-22 00:37:58.308810722 +0000 UTC m=+0.161415185 container start 03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.318 182759 DEBUG oslo_concurrency.lockutils [None req-05c120eb-7169-456f-a6e0-a0d6bf45e198 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:37:58 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [NOTICE]   (241584) : New worker (241586) forked
Jan 21 19:37:58 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [NOTICE]   (241584) : Loading success.
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.357 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 unbound from our chassis#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.359 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.372 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c437f922-8677-4d98-800d-975959ef5747]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.373 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01fa8e13-91 in ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.376 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01fa8e13-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.376 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[647043bc-b0b0-42fc-9128-cb7330883bb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.377 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9888b585-469e-4665-8f11-d1e21878f569]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.389 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5bc34a-7bd1-4b00-aff1-2d8b835d8920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.408 182759 DEBUG nova.network.neutron [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updated VIF entry in instance network info cache for port dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.408 182759 DEBUG nova.network.neutron [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.415 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[51cf77ef-2070-49c4-9bec-4d77e1c6c1f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.429 182759 DEBUG oslo_concurrency.lockutils [req-17bd973b-8377-4dfc-a743-29c37d24c209 req-1f8099fe-70e0-4fad-98f7-de89eab7b619 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.441 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb8bdb1-0c74-45fc-847f-2337cd6d4653]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.447 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[67bce5ae-0a8d-4a03-a2ec-dcfd183f2892]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 NetworkManager[55017]: <info>  [1769042278.4483] manager: (tap01fa8e13-90): new Veth device (/org/freedesktop/NetworkManager/Devices/322)
Jan 21 19:37:58 np0005591285 systemd-udevd[241514]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.478 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d57786-f7b2-4bb6-8c30-affa1eb040a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.481 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3a491f-a420-4158-abe5-f74a095e7d2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 NetworkManager[55017]: <info>  [1769042278.5037] device (tap01fa8e13-90): carrier: link connected
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.509 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[70754ebf-3ea7-4128-9137-6f8dcbb14671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.528 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9dd7ead4-d417-45a9-9924-0e2f8a33f312]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01fa8e13-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:ff:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683716, 'reachable_time': 43542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241605, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.544 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[91cf532d-f72b-41e0-85eb-eaae8310831b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:ff76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 683716, 'tstamp': 683716}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241606, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.560 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b9c74c-dc0b-4189-abf0-342b68fe1d2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01fa8e13-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:ff:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683716, 'reachable_time': 43542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241607, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.589 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f8230e39-6137-411f-a481-29f43e43d2e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.620 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9a425ed4-4a36-4350-9b2e-fcf937d439fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.622 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01fa8e13-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.622 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.623 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01fa8e13-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.625 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:58 np0005591285 NetworkManager[55017]: <info>  [1769042278.6261] manager: (tap01fa8e13-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 21 19:37:58 np0005591285 kernel: tap01fa8e13-90: entered promiscuous mode
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.628 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01fa8e13-90, col_values=(('external_ids', {'iface-id': '63ad2747-135a-46c8-90ca-ec1def31a1c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:37:58 np0005591285 ovn_controller[94908]: 2026-01-22T00:37:58Z|00658|binding|INFO|Releasing lport 63ad2747-135a-46c8-90ca-ec1def31a1c2 from this chassis (sb_readonly=0)
Jan 21 19:37:58 np0005591285 nova_compute[182755]: 2026-01-22 00:37:58.646 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.647 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.648 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad70aed-0083-4e9b-9a15-d67df463ca30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.649 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-01fa8e13-9f62-4b06-88db-79f2e6ca65b8
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.pid.haproxy
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 01fa8e13-9f62-4b06-88db-79f2e6ca65b8
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:37:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:37:58.651 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'env', 'PROCESS_TAG=haproxy-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01fa8e13-9f62-4b06-88db-79f2e6ca65b8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:37:58 np0005591285 podman[241637]: 2026-01-22 00:37:58.995785808 +0000 UTC m=+0.042512275 container create 2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:37:59 np0005591285 systemd[1]: Started libpod-conmon-2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9.scope.
Jan 21 19:37:59 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:37:59 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4f5a3f86a0da0b19a3f41725f2a2f8b24a64fdc90919740746a7197d75cbfed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:37:59 np0005591285 podman[241637]: 2026-01-22 00:37:59.058902427 +0000 UTC m=+0.105628924 container init 2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 19:37:59 np0005591285 podman[241637]: 2026-01-22 00:37:59.064525758 +0000 UTC m=+0.111252225 container start 2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:37:59 np0005591285 podman[241637]: 2026-01-22 00:37:58.973833497 +0000 UTC m=+0.020559974 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:37:59 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [NOTICE]   (241656) : New worker (241658) forked
Jan 21 19:37:59 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [NOTICE]   (241656) : Loading success.
Jan 21 19:37:59 np0005591285 nova_compute[182755]: 2026-01-22 00:37:59.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:37:59 np0005591285 nova_compute[182755]: 2026-01-22 00:37:59.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:37:59 np0005591285 nova_compute[182755]: 2026-01-22 00:37:59.243 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:37:59 np0005591285 nova_compute[182755]: 2026-01-22 00:37:59.935 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.278 182759 DEBUG nova.compute.manager [req-3ffcb3c5-4fe8-43bf-803b-02115c9938c0 req-bcc83d78-fc23-4a78-8fae-541cef52536e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.278 182759 DEBUG oslo_concurrency.lockutils [req-3ffcb3c5-4fe8-43bf-803b-02115c9938c0 req-bcc83d78-fc23-4a78-8fae-541cef52536e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.279 182759 DEBUG oslo_concurrency.lockutils [req-3ffcb3c5-4fe8-43bf-803b-02115c9938c0 req-bcc83d78-fc23-4a78-8fae-541cef52536e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.279 182759 DEBUG oslo_concurrency.lockutils [req-3ffcb3c5-4fe8-43bf-803b-02115c9938c0 req-bcc83d78-fc23-4a78-8fae-541cef52536e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.279 182759 DEBUG nova.compute.manager [req-3ffcb3c5-4fe8-43bf-803b-02115c9938c0 req-bcc83d78-fc23-4a78-8fae-541cef52536e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] No waiting events found dispatching network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.279 182759 WARNING nova.compute.manager [req-3ffcb3c5-4fe8-43bf-803b-02115c9938c0 req-bcc83d78-fc23-4a78-8fae-541cef52536e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received unexpected event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa for instance with vm_state active and task_state None.#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.281 182759 DEBUG nova.compute.manager [req-b4e509d2-867e-4cce-8376-dbd785561c7d req-29c54e64-8eae-4aa4-a54a-cb9939004f10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.282 182759 DEBUG oslo_concurrency.lockutils [req-b4e509d2-867e-4cce-8376-dbd785561c7d req-29c54e64-8eae-4aa4-a54a-cb9939004f10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.282 182759 DEBUG oslo_concurrency.lockutils [req-b4e509d2-867e-4cce-8376-dbd785561c7d req-29c54e64-8eae-4aa4-a54a-cb9939004f10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.283 182759 DEBUG oslo_concurrency.lockutils [req-b4e509d2-867e-4cce-8376-dbd785561c7d req-29c54e64-8eae-4aa4-a54a-cb9939004f10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.283 182759 DEBUG nova.compute.manager [req-b4e509d2-867e-4cce-8376-dbd785561c7d req-29c54e64-8eae-4aa4-a54a-cb9939004f10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] No waiting events found dispatching network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:38:00 np0005591285 nova_compute[182755]: 2026-01-22 00:38:00.283 182759 WARNING nova.compute.manager [req-b4e509d2-867e-4cce-8376-dbd785561c7d req-29c54e64-8eae-4aa4-a54a-cb9939004f10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received unexpected event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f for instance with vm_state active and task_state None.#033[00m
Jan 21 19:38:01 np0005591285 nova_compute[182755]: 2026-01-22 00:38:01.644 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:38:03.005 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:38:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:38:03.006 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:38:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:38:03.007 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:38:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:38:04Z|00659|binding|INFO|Releasing lport 63ad2747-135a-46c8-90ca-ec1def31a1c2 from this chassis (sb_readonly=0)
Jan 21 19:38:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:38:04Z|00660|binding|INFO|Releasing lport 4faa3c3e-65cf-4db1-ab38-d3f17011be65 from this chassis (sb_readonly=0)
Jan 21 19:38:04 np0005591285 nova_compute[182755]: 2026-01-22 00:38:04.783 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:04 np0005591285 NetworkManager[55017]: <info>  [1769042284.7844] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 21 19:38:04 np0005591285 NetworkManager[55017]: <info>  [1769042284.7854] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 21 19:38:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:38:04Z|00661|binding|INFO|Releasing lport 63ad2747-135a-46c8-90ca-ec1def31a1c2 from this chassis (sb_readonly=0)
Jan 21 19:38:04 np0005591285 nova_compute[182755]: 2026-01-22 00:38:04.814 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:04 np0005591285 ovn_controller[94908]: 2026-01-22T00:38:04Z|00662|binding|INFO|Releasing lport 4faa3c3e-65cf-4db1-ab38-d3f17011be65 from this chassis (sb_readonly=0)
Jan 21 19:38:04 np0005591285 nova_compute[182755]: 2026-01-22 00:38:04.828 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:04 np0005591285 nova_compute[182755]: 2026-01-22 00:38:04.937 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:05 np0005591285 nova_compute[182755]: 2026-01-22 00:38:05.301 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:38:05.303 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:38:05 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:38:05.304 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:38:05 np0005591285 nova_compute[182755]: 2026-01-22 00:38:05.347 182759 DEBUG nova.compute.manager [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-changed-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:38:05 np0005591285 nova_compute[182755]: 2026-01-22 00:38:05.347 182759 DEBUG nova.compute.manager [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing instance network info cache due to event network-changed-6158c039-5f87-4d75-91cd-734e6337b27f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:38:05 np0005591285 nova_compute[182755]: 2026-01-22 00:38:05.348 182759 DEBUG oslo_concurrency.lockutils [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:38:05 np0005591285 nova_compute[182755]: 2026-01-22 00:38:05.348 182759 DEBUG oslo_concurrency.lockutils [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:38:05 np0005591285 nova_compute[182755]: 2026-01-22 00:38:05.348 182759 DEBUG nova.network.neutron [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing network info cache for port 6158c039-5f87-4d75-91cd-734e6337b27f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:38:06 np0005591285 nova_compute[182755]: 2026-01-22 00:38:06.647 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.245 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.245 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.246 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.656 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.877 182759 DEBUG nova.network.neutron [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updated VIF entry in instance network info cache for port 6158c039-5f87-4d75-91cd-734e6337b27f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.878 182759 DEBUG nova.network.neutron [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.916 182759 DEBUG oslo_concurrency.lockutils [req-9546a27c-9aee-4960-a1cb-a5b31551fef7 req-fbf69578-6448-40de-bac7-e341cc965c69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.917 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.917 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:38:07 np0005591285 nova_compute[182755]: 2026-01-22 00:38:07.917 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f5280c26-3c89-472c-96cd-5d580ed702ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:38:09 np0005591285 podman[241669]: 2026-01-22 00:38:09.186800025 +0000 UTC m=+0.061308531 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 21 19:38:09 np0005591285 podman[241670]: 2026-01-22 00:38:09.196013043 +0000 UTC m=+0.067679552 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:38:09 np0005591285 nova_compute[182755]: 2026-01-22 00:38:09.939 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:38:10Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:e9:26 10.100.0.13
Jan 21 19:38:10 np0005591285 ovn_controller[94908]: 2026-01-22T00:38:10Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:e9:26 10.100.0.13
Jan 21 19:38:11 np0005591285 nova_compute[182755]: 2026-01-22 00:38:11.652 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:12 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:38:12.306 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.063 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.088 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.088 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.089 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.089 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.089 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.090 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:13 np0005591285 nova_compute[182755]: 2026-01-22 00:38:13.090 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:38:14 np0005591285 nova_compute[182755]: 2026-01-22 00:38:14.942 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:15 np0005591285 nova_compute[182755]: 2026-01-22 00:38:15.058 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:16 np0005591285 nova_compute[182755]: 2026-01-22 00:38:16.657 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:17 np0005591285 nova_compute[182755]: 2026-01-22 00:38:17.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:18 np0005591285 podman[241724]: 2026-01-22 00:38:18.185130408 +0000 UTC m=+0.058634769 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.678 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.678 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.678 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.678 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.934 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.994 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:38:18 np0005591285 nova_compute[182755]: 2026-01-22 00:38:18.995 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.060 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.222 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.223 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5566MB free_disk=73.14834976196289GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.223 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.224 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.298 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance f5280c26-3c89-472c-96cd-5d580ed702ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.298 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.298 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.335 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.350 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.372 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.372 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:38:19 np0005591285 nova_compute[182755]: 2026-01-22 00:38:19.944 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:20 np0005591285 nova_compute[182755]: 2026-01-22 00:38:20.372 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:21 np0005591285 podman[241755]: 2026-01-22 00:38:21.185673433 +0000 UTC m=+0.055978918 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:38:21 np0005591285 podman[241754]: 2026-01-22 00:38:21.185643082 +0000 UTC m=+0.060745896 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 19:38:21 np0005591285 nova_compute[182755]: 2026-01-22 00:38:21.660 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.182 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'name': 'tempest-TestGettingAddress-server-960448291', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000af', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.187 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f5280c26-3c89-472c-96cd-5d580ed702ce / tap6158c039-5f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.188 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f5280c26-3c89-472c-96cd-5d580ed702ce / tapdc8f6b9c-58 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.189 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.190 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b822e22-6de6-402f-8489-e0ad839c12ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.184045', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a83f6dc8-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '4859afc51cb358527832be3cde58d65c1731bdbe376b4cd3f3bfde479a364696'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.184045', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a83f9262-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '6e97042aba63e73430a8ba0c3c344f680d932ec2bea4377fba4c66cc028c5164'}]}, 'timestamp': '2026-01-22 00:38:23.191683', '_unique_id': 'eaac62a4aae44c95ba9e214dcf213cbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.196 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.199 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.199 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0e95dfe-dae8-42c9-8ac1-771951d828fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.199370', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a840df50-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '4b8d4bb98d355c05f78c69a8f804d3daa6a00facf0a7e0b39a59c92e3cfef26c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.199370', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a840eb58-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '62b321f588b984a4a928b04b6e386a74f907d050dbe3c8d0f7f770493e0d4fe6'}]}, 'timestamp': '2026-01-22 00:38:23.200051', '_unique_id': 'debcfe70a3654193ab0d1b73e8df3e8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.200 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.201 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.201 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>]
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.228 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.read.bytes volume: 29264384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.229 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1999c2b-63e9-4fb4-9924-421e119e9924', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29264384, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.202233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a8455940-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': '7a75375ebf54038f4eb6161b7cea904232dde0c25fe8d8bc6ca01933e6d629c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.202233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a8456494-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'ac264a70817fa26ca17fe56aed0b0cbdcf921b0efbe4b68daa12dc576fa61b08'}]}, 'timestamp': '2026-01-22 00:38:23.229365', '_unique_id': 'f33b45ff6f3e4a07bd17a42c97773721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.230 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.231 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.write.requests volume: 302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.231 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2700c89-8a79-4b4b-9428-75d4253711bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 302, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.231443', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a845bfde-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': '509264d5201c3be457da80bbbf18fe5a2e225ae97f8aef132199dec3b823f17d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.231443', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a845c7ae-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': '694e53f0fa797beeb5d41aa0283bc9c850150558df5ef7b2c2b9f933fe28ba8d'}]}, 'timestamp': '2026-01-22 00:38:23.231900', '_unique_id': 'b248ce5fcb1e4d7bb13650473feb1390'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.233 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.233 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '476f854e-b303-4b2a-9a4e-a41f40feb08f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.233023', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a845fc7e-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'c9351b7f8df9c629a685116acd79f838985985fbffc3d9d957d3d869e2f204df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.233023', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a846043a-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'a76819e64d3fb2165199cd297ccfe475cd6dfa17249ea28dc9adb1fab5e835e9'}]}, 'timestamp': '2026-01-22 00:38:23.233457', '_unique_id': 'a110848fbcbe484b8a92f361186df9df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.234 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.249 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/memory.usage volume: 43.9296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbc02822-deb8-44c5-81f3-67a1c7aeda60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.9296875, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'timestamp': '2026-01-22T00:38:23.235037', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a848995c-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.968793612, 'message_signature': 'c9243af94171db9cec26e87ccc2ad85a9b0d3570404c1bb9e6ee2b56d26120d8'}]}, 'timestamp': '2026-01-22 00:38:23.250474', '_unique_id': 'f6ad1546c1934ab79415d8f16ca1e4ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.251 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.252 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.253 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>]
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.253 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.253 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45ce4d7b-6e22-4382-99b5-4c09287484ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.253425', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a8491dd2-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'bc1daa07c88bfd520fc05eeb3c5b9a6ce05e9c80c8451208a26f09f5b34e4a8e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.253425', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a8492d7c-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '7c9b43595b393665ca533ac5bddce23870c6ae05319ea2ffe0cde843ad9a445e'}]}, 'timestamp': '2026-01-22 00:38:23.254200', '_unique_id': '7c4f74c8737148279f228e877da9a9ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.255 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.256 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.256 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21b4baec-8042-4c08-9217-52272f0667f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.256374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a849919a-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'f5a1cc03825e67239435c13db155780fd41772892cd66e537a59f7ec5934f23f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.256374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a8499f28-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '0e17c85c8c3001812852bc2428c85657cb4b846df495dcde0bbc38a60224bc36'}]}, 'timestamp': '2026-01-22 00:38:23.257078', '_unique_id': 'f60615b89aff468b8155fe967eec450c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.257 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.258 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77870e57-8d67-495e-ab28-0726e75debb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.258753', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a849ed70-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '28335c89907616012a5fac070a7c35fcaa4a4f907d65538e7a5603daec7edb79'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.258753', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a849f86a-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '81d691d0179c02651b6afd45326c8f8c5a96533e8cfde3ef669ba679b617845e'}]}, 'timestamp': '2026-01-22 00:38:23.259386', '_unique_id': '63edd14c62b34c1c9093b40ed1dd978e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.259 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.260 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.261 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/cpu volume: 12690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a251a5-aa21-4174-8f1e-2d85ca19592e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12690000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'timestamp': '2026-01-22T00:38:23.261087', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a84a482e-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.968793612, 'message_signature': 'e6631c922dce62f11b2b388a23a96b20e99f4e708e716f83540b3f9e9e17c032'}]}, 'timestamp': '2026-01-22 00:38:23.261432', '_unique_id': '1048223e78384672921fceae6841893b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.263 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.263 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.bytes volume: 2326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a8ae8e0-6f4b-4eed-b555-f82c67cd8e74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.263369', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a84aa1a2-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'd28b5947f415f8fad29ae02f0f26ee5544cd3758fb088f12d11fd2034ab38f34'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2326, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.263369', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a84aafa8-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '1480fc196c5c67fccb9c755bd3f0967212d0b7fbca62fdf8bd626b5a3ef3946d'}]}, 'timestamp': '2026-01-22 00:38:23.264084', '_unique_id': 'f07aac2a026f42bcad9902a9ce8a0e61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.264 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.266 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.266 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>]
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.266 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.266 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-960448291>]
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.267 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.write.latency volume: 3130276574 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.267 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95d86d22-9152-448a-9e3d-1f9533193794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3130276574, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.267119', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a84b33ce-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'f31b8ae9538059f7818334c62046926850b895e4c4709b804eea2319713b6111'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.267119', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a84b3fea-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'fd542ba82b3627682298116d0ed25c29aece41d88135b60f7c5f5770f2bdc2cc'}]}, 'timestamp': '2026-01-22 00:38:23.267767', '_unique_id': '5608d90ee13c4cca8dd3b578bebff3de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.268 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.269 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.269 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.269 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0348397c-dbf0-4e8b-b9b4-ed46c8bf6d38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.269493', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a84b9058-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'c1e4da1dc3bcb6c21720818e92f1128988179590a4e8cea3ba23bd034e3365d5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.269493', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a84b9d46-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '5bd80b3988c2b7f3b231c93e8fdd5ea60e202ac834268249c8e3d26c24ab01db'}]}, 'timestamp': '2026-01-22 00:38:23.270155', '_unique_id': '33bc1f5c6b17405dafc31dbd20db72c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.270 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.271 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.284 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.284 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7744aa90-b039-465f-903d-c109fc74b5c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.271867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a84dde44-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.991098652, 'message_signature': '0bb3d2b7c6b41a71675bfb0e178da9e1df3e2ba68e4faabf53d4fa98ef3cde22'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.271867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a84dec72-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.991098652, 'message_signature': '1e9b90e59acf42b3e1ef48ff0300da8d0779106ac29c7291e4327a3a31ca5c7f'}]}, 'timestamp': '2026-01-22 00:38:23.285276', '_unique_id': 'd1b7e6fa700049c4b7ed0bccb45722a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.286 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.287 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.287 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.287 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3019f60-2a54-4914-9c30-75c20063d7a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.287520', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a84e500e-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.991098652, 'message_signature': 'f592e68e2f994f66cdfe569dd197fc00bb17ae6253536f6e4522873c784de774'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.287520', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a84e5aae-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.991098652, 'message_signature': 'f4c2fe39a5199223cf0ab32f093655efca6e9c9a5960b94f646be0b705edf0d9'}]}, 'timestamp': '2026-01-22 00:38:23.288097', '_unique_id': 'e18ce9f7bb7c4bc98b85c60fd68779c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.288 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.290 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.290 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da67d920-cc4f-4071-963f-b43c3aa3e1f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.290261', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a84ebc9c-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'a83df8a2e79037d02a5e9ede3660ac29216fe6877358f566f3e6378c9f32a13c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.290261', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a84ec944-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '38a3b0f78793cf00833d943cd941d8f3e48dba71acca447b19a473846ce636d4'}]}, 'timestamp': '2026-01-22 00:38:23.290987', '_unique_id': '373ef79c84e943df9765fd32d13e957d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.291 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.293 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.293 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.read.latency volume: 153559596 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.293 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.read.latency volume: 20383646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22219001-88a0-4d04-8c7b-8356cf653500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 153559596, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.293402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a84f373a-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': '1278c2f705e91250d7eb447054df34489a922c2efab19ddb2d246ebb3e641d14'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20383646, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.293402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a84f4626-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': '8cb70545664a333b803a3cee9dfcfb948aeb637f71076908115a66a09da9ab76'}]}, 'timestamp': '2026-01-22 00:38:23.294111', '_unique_id': 'ea81e26245d7460c857108d58f09ee57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.294 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.296 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.read.requests volume: 1055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.296 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4592e697-b5ed-4f9d-9f5a-228d79665e7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1055, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.296135', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a84fa170-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'eaf33eebd1d8655767a471af247ec118735073d84de44438a7e2db15e299e978'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.296135', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a84faa94-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.921439708, 'message_signature': 'befa844b02d88ba2102443f9ab50e4d75fffb0cbbfac72d1d7e9b77113b7f1e4'}]}, 'timestamp': '2026-01-22 00:38:23.296828', '_unique_id': '32d08a80523c447397a742793ce9118d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.298 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.298 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4addccf8-0a53-4ac7-997e-a922f5d9258d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce-vda', 'timestamp': '2026-01-22T00:38:23.298348', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a84ff594-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.991098652, 'message_signature': '4c8d8fd9349a99ac5d70a984effff1e80fdcd465f17c341fc74d1e7b8d1946ff'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'f5280c26-3c89-472c-96cd-5d580ed702ce-sda', 'timestamp': '2026-01-22T00:38:23.298348', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'instance-000000af', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a84ffff8-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.991098652, 'message_signature': 'f2a0f6fb33409e62d120a5f03b4087de3a69b2d7019bbed265dbba875c88b9ed'}]}, 'timestamp': '2026-01-22 00:38:23.298886', '_unique_id': 'e4103d3319bf4094b2e0ac049a577a18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.299 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.300 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.bytes volume: 4121 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.300 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.bytes volume: 772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f2b0a79-4dfa-461d-a0a9-ac4a69a704fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4121, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.300082', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a8503982-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'dfa97e072c3a0b1dff2ed052794a35450dd1200ba4777e228778cc0f37985b6b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 772, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.300082', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a8504472-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': 'afbfd6cd4f2550da584c8233f73bb36fb19ce7182c233ca9a3ccd024c83af3a8'}]}, 'timestamp': '2026-01-22 00:38:23.300627', '_unique_id': '25eb7b3415da4f60947d8800fe76a5ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.301 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.302 12 DEBUG ceilometer.compute.pollsters [-] f5280c26-3c89-472c-96cd-5d580ed702ce/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf7b3231-b82b-4566-b325-d7ccd0fd7029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tap6158c039-5f', 'timestamp': '2026-01-22T00:38:23.301850', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tap6158c039-5f', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:27:e9:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6158c039-5f'}, 'message_id': 'a8507fdc-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '37eb19db53b5436e66b458daaf534624df0a3e4609839ff03eb5fe9995c0bbac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000af-f5280c26-3c89-472c-96cd-5d580ed702ce-tapdc8f6b9c-58', 'timestamp': '2026-01-22T00:38:23.301850', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-960448291', 'name': 'tapdc8f6b9c-58', 'instance_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:20:7f:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdc8f6b9c-58'}, 'message_id': 'a8508aa4-f72a-11f0-b13b-fa163e425b77', 'monotonic_time': 6861.903449274, 'message_signature': '4d08e62116a3db4423d927e5eb63ccc89720b3f96db21899c4a437d9cc340186'}]}, 'timestamp': '2026-01-22 00:38:23.302422', '_unique_id': '46201f4c1a544acf839e5a02776db10d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:38:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:38:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:38:24 np0005591285 podman[241797]: 2026-01-22 00:38:24.202541895 +0000 UTC m=+0.076846969 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 21 19:38:24 np0005591285 nova_compute[182755]: 2026-01-22 00:38:24.946 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:26 np0005591285 nova_compute[182755]: 2026-01-22 00:38:26.663 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:29 np0005591285 nova_compute[182755]: 2026-01-22 00:38:29.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:29 np0005591285 nova_compute[182755]: 2026-01-22 00:38:29.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:31 np0005591285 nova_compute[182755]: 2026-01-22 00:38:31.666 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:34 np0005591285 nova_compute[182755]: 2026-01-22 00:38:34.676 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:34 np0005591285 nova_compute[182755]: 2026-01-22 00:38:34.709 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Triggering sync for uuid f5280c26-3c89-472c-96cd-5d580ed702ce _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 21 19:38:34 np0005591285 nova_compute[182755]: 2026-01-22 00:38:34.710 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:38:34 np0005591285 nova_compute[182755]: 2026-01-22 00:38:34.710 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:38:34 np0005591285 nova_compute[182755]: 2026-01-22 00:38:34.734 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:38:34 np0005591285 nova_compute[182755]: 2026-01-22 00:38:34.949 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:36 np0005591285 nova_compute[182755]: 2026-01-22 00:38:36.669 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:39 np0005591285 nova_compute[182755]: 2026-01-22 00:38:39.951 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:40 np0005591285 podman[241824]: 2026-01-22 00:38:40.190776975 +0000 UTC m=+0.062081586 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 19:38:40 np0005591285 podman[241825]: 2026-01-22 00:38:40.221483584 +0000 UTC m=+0.078138910 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:38:41 np0005591285 nova_compute[182755]: 2026-01-22 00:38:41.671 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:44 np0005591285 nova_compute[182755]: 2026-01-22 00:38:44.953 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:46 np0005591285 nova_compute[182755]: 2026-01-22 00:38:46.675 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:49 np0005591285 podman[241865]: 2026-01-22 00:38:49.18017886 +0000 UTC m=+0.054869081 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:38:49 np0005591285 nova_compute[182755]: 2026-01-22 00:38:49.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:49 np0005591285 nova_compute[182755]: 2026-01-22 00:38:49.955 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:50 np0005591285 nova_compute[182755]: 2026-01-22 00:38:50.747 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:51 np0005591285 nova_compute[182755]: 2026-01-22 00:38:51.678 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:52 np0005591285 podman[241889]: 2026-01-22 00:38:52.180542119 +0000 UTC m=+0.058924651 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:38:52 np0005591285 podman[241890]: 2026-01-22 00:38:52.181625189 +0000 UTC m=+0.054440961 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:38:52 np0005591285 nova_compute[182755]: 2026-01-22 00:38:52.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:38:52 np0005591285 nova_compute[182755]: 2026-01-22 00:38:52.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:38:54 np0005591285 nova_compute[182755]: 2026-01-22 00:38:54.959 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:55 np0005591285 podman[241931]: 2026-01-22 00:38:55.209796009 +0000 UTC m=+0.081713557 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:38:56 np0005591285 nova_compute[182755]: 2026-01-22 00:38:56.682 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:38:59 np0005591285 nova_compute[182755]: 2026-01-22 00:38:59.960 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:01 np0005591285 nova_compute[182755]: 2026-01-22 00:39:01.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:03.006 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:03.007 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:03.008 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:03 np0005591285 podman[198602]: time="2026-01-22T00:39:03Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 19:39:03 np0005591285 podman[198602]: @ - - [22/Jan/2026:00:39:03 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 24054 "" "Go-http-client/1.1"
Jan 21 19:39:04 np0005591285 nova_compute[182755]: 2026-01-22 00:39:04.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:06 np0005591285 nova_compute[182755]: 2026-01-22 00:39:06.688 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.256 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.256 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.256 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.289 182759 DEBUG nova.compute.manager [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-changed-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.289 182759 DEBUG nova.compute.manager [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing instance network info cache due to event network-changed-6158c039-5f87-4d75-91cd-734e6337b27f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.289 182759 DEBUG oslo_concurrency.lockutils [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.290 182759 DEBUG oslo_concurrency.lockutils [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.290 182759 DEBUG nova.network.neutron [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Refreshing network info cache for port 6158c039-5f87-4d75-91cd-734e6337b27f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.409 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.410 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.482 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.483 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.483 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.483 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.484 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.488 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.498 182759 INFO nova.compute.manager [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Terminating instance#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.509 182759 DEBUG nova.compute.manager [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:39:07 np0005591285 kernel: tap6158c039-5f (unregistering): left promiscuous mode
Jan 21 19:39:07 np0005591285 NetworkManager[55017]: <info>  [1769042347.5346] device (tap6158c039-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.607 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:39:07Z|00663|binding|INFO|Releasing lport 6158c039-5f87-4d75-91cd-734e6337b27f from this chassis (sb_readonly=0)
Jan 21 19:39:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:39:07Z|00664|binding|INFO|Setting lport 6158c039-5f87-4d75-91cd-734e6337b27f down in Southbound
Jan 21 19:39:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:39:07Z|00665|binding|INFO|Removing iface tap6158c039-5f ovn-installed in OVS
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.610 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.616 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:e9:26 10.100.0.13'], port_security=['fa:16:3e:27:e9:26 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96576974-adfc-492e-9141-63dd99e1cb25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=700861ed-e604-4e52-bc1a-65ca23f63d88, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=6158c039-5f87-4d75-91cd-734e6337b27f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.617 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 6158c039-5f87-4d75-91cd-734e6337b27f in datapath 96576974-adfc-492e-9141-63dd99e1cb25 unbound from our chassis#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.618 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 96576974-adfc-492e-9141-63dd99e1cb25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.620 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f43040-6dae-4298-8412-4a0ad01e643e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.620 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 namespace which is not needed anymore#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.629 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 kernel: tapdc8f6b9c-58 (unregistering): left promiscuous mode
Jan 21 19:39:07 np0005591285 NetworkManager[55017]: <info>  [1769042347.6428] device (tapdc8f6b9c-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:39:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:39:07Z|00666|binding|INFO|Releasing lport dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa from this chassis (sb_readonly=0)
Jan 21 19:39:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:39:07Z|00667|binding|INFO|Setting lport dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa down in Southbound
Jan 21 19:39:07 np0005591285 ovn_controller[94908]: 2026-01-22T00:39:07Z|00668|binding|INFO|Removing iface tapdc8f6b9c-58 ovn-installed in OVS
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.647 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:07.655 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:7f:7e 2001:db8:0:1:f816:3eff:fe20:7f7e 2001:db8::f816:3eff:fe20:7f7e'], port_security=['fa:16:3e:20:7f:7e 2001:db8:0:1:f816:3eff:fe20:7f7e 2001:db8::f816:3eff:fe20:7f7e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe20:7f7e/64 2001:db8::f816:3eff:fe20:7f7e/64', 'neutron:device_id': 'f5280c26-3c89-472c-96cd-5d580ed702ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4381a94a-5b04-4450-b603-573605756783', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.667 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 21 19:39:07 np0005591285 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000af.scope: Consumed 16.447s CPU time.
Jan 21 19:39:07 np0005591285 systemd-machined[154022]: Machine qemu-75-instance-000000af terminated.
Jan 21 19:39:07 np0005591285 NetworkManager[55017]: <info>  [1769042347.7305] manager: (tap6158c039-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/326)
Jan 21 19:39:07 np0005591285 NetworkManager[55017]: <info>  [1769042347.7440] manager: (tapdc8f6b9c-58): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.786 182759 INFO nova.virt.libvirt.driver [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Instance destroyed successfully.#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.787 182759 DEBUG nova.objects.instance [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid f5280c26-3c89-472c-96cd-5d580ed702ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.816 182759 DEBUG nova.virt.libvirt.vif [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:37:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-960448291',display_name='tempest-TestGettingAddress-server-960448291',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-960448291',id=175,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:37:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-d7q9t02g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:37:58Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=f5280c26-3c89-472c-96cd-5d580ed702ce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.817 182759 DEBUG nova.network.os_vif_util [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.817 182759 DEBUG nova.network.os_vif_util [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.818 182759 DEBUG os_vif [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.819 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.820 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6158c039-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.821 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.823 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.825 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.830 182759 INFO os_vif [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:e9:26,bridge_name='br-int',has_traffic_filtering=True,id=6158c039-5f87-4d75-91cd-734e6337b27f,network=Network(96576974-adfc-492e-9141-63dd99e1cb25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6158c039-5f')#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.831 182759 DEBUG nova.virt.libvirt.vif [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:37:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-960448291',display_name='tempest-TestGettingAddress-server-960448291',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-960448291',id=175,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKUPE/b7T//C7EWRAZhxqFsRGr7AXrACj+OWHY0bSytLiLst+E4mc3tNVo/ZttM4rMO8VKrIAX0ipjrNBzr3hEfrSo0ADkuS+9zF2SWKTGt3QdgHdDGoyPuHVN6vYWqrHA==',key_name='tempest-TestGettingAddress-2032783061',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:37:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-d7q9t02g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:37:58Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=f5280c26-3c89-472c-96cd-5d580ed702ce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.831 182759 DEBUG nova.network.os_vif_util [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.833 182759 DEBUG nova.network.os_vif_util [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.833 182759 DEBUG os_vif [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.834 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.834 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc8f6b9c-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.835 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.836 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.838 182759 INFO os_vif [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:7f:7e,bridge_name='br-int',has_traffic_filtering=True,id=dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa,network=Network(01fa8e13-9f62-4b06-88db-79f2e6ca65b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc8f6b9c-58')#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.838 182759 INFO nova.virt.libvirt.driver [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Deleting instance files /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce_del#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.839 182759 INFO nova.virt.libvirt.driver [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Deletion of /var/lib/nova/instances/f5280c26-3c89-472c-96cd-5d580ed702ce_del complete#033[00m
Jan 21 19:39:07 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [NOTICE]   (241584) : haproxy version is 2.8.14-c23fe91
Jan 21 19:39:07 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [NOTICE]   (241584) : path to executable is /usr/sbin/haproxy
Jan 21 19:39:07 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [WARNING]  (241584) : Exiting Master process...
Jan 21 19:39:07 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [WARNING]  (241584) : Exiting Master process...
Jan 21 19:39:07 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [ALERT]    (241584) : Current worker (241586) exited with code 143 (Terminated)
Jan 21 19:39:07 np0005591285 neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25[241580]: [WARNING]  (241584) : All workers exited. Exiting... (0)
Jan 21 19:39:07 np0005591285 systemd[1]: libpod-03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a.scope: Deactivated successfully.
Jan 21 19:39:07 np0005591285 podman[241987]: 2026-01-22 00:39:07.918679328 +0000 UTC m=+0.212207748 container died 03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.929 182759 DEBUG nova.compute.manager [req-c4ff11b9-d5f2-41e4-a463-bc8756e75f47 req-983272d7-d88a-4e89-ba64-e1cb885a3e42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-unplugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.930 182759 DEBUG oslo_concurrency.lockutils [req-c4ff11b9-d5f2-41e4-a463-bc8756e75f47 req-983272d7-d88a-4e89-ba64-e1cb885a3e42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.930 182759 DEBUG oslo_concurrency.lockutils [req-c4ff11b9-d5f2-41e4-a463-bc8756e75f47 req-983272d7-d88a-4e89-ba64-e1cb885a3e42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.930 182759 DEBUG oslo_concurrency.lockutils [req-c4ff11b9-d5f2-41e4-a463-bc8756e75f47 req-983272d7-d88a-4e89-ba64-e1cb885a3e42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.930 182759 DEBUG nova.compute.manager [req-c4ff11b9-d5f2-41e4-a463-bc8756e75f47 req-983272d7-d88a-4e89-ba64-e1cb885a3e42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] No waiting events found dispatching network-vif-unplugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.931 182759 DEBUG nova.compute.manager [req-c4ff11b9-d5f2-41e4-a463-bc8756e75f47 req-983272d7-d88a-4e89-ba64-e1cb885a3e42 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-unplugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:39:07 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a-userdata-shm.mount: Deactivated successfully.
Jan 21 19:39:07 np0005591285 systemd[1]: var-lib-containers-storage-overlay-fa389c99bda41224dd64502566e3fce2fc1709898b21a1b1ce303c204a0ce220-merged.mount: Deactivated successfully.
Jan 21 19:39:07 np0005591285 podman[241987]: 2026-01-22 00:39:07.961773102 +0000 UTC m=+0.255301522 container cleanup 03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:39:07 np0005591285 systemd[1]: libpod-conmon-03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a.scope: Deactivated successfully.
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.982 182759 INFO nova.compute.manager [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.984 182759 DEBUG oslo.service.loopingcall [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.984 182759 DEBUG nova.compute.manager [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:39:07 np0005591285 nova_compute[182755]: 2026-01-22 00:39:07.985 182759 DEBUG nova.network.neutron [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:39:08 np0005591285 podman[242049]: 2026-01-22 00:39:08.213598828 +0000 UTC m=+0.228843967 container remove 03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.219 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[12325689-278d-4513-a468-9a2ed9e48aef]: (4, ('Thu Jan 22 12:39:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 (03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a)\n03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a\nThu Jan 22 12:39:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 (03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a)\n03cae6243bde812964866312c07ebd0a082f96fd451167b898a5492acd76e92a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.221 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[43ad4b29-562a-46c9-a123-73c3eb6646d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.222 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96576974-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:39:08 np0005591285 kernel: tap96576974-a0: left promiscuous mode
Jan 21 19:39:08 np0005591285 nova_compute[182755]: 2026-01-22 00:39:08.224 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:08 np0005591285 nova_compute[182755]: 2026-01-22 00:39:08.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.241 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[874705f3-3597-44f1-aeb3-9ce48b047e9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.257 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b316654-e44a-404b-91ed-13e1e985cb2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.258 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[bb534d57-f16b-4753-9f8a-6e1c0f50cfc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.279 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[631944f1-d737-4e33-a05a-46e1dd1f3b95]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683625, 'reachable_time': 18936, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242064, 'error': None, 'target': 'ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.282 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-96576974-adfc-492e-9141-63dd99e1cb25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.283 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4c1b05-49e2-4965-8295-efb4a907068a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.284 104259 INFO neutron.agent.ovn.metadata.agent [-] Port dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 unbound from our chassis#033[00m
Jan 21 19:39:08 np0005591285 systemd[1]: run-netns-ovnmeta\x2d96576974\x2dadfc\x2d492e\x2d9141\x2d63dd99e1cb25.mount: Deactivated successfully.
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.285 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.286 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe70bc5-9ff5-4f90-ae12-773bf43871e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.286 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 namespace which is not needed anymore#033[00m
Jan 21 19:39:08 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [NOTICE]   (241656) : haproxy version is 2.8.14-c23fe91
Jan 21 19:39:08 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [NOTICE]   (241656) : path to executable is /usr/sbin/haproxy
Jan 21 19:39:08 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [WARNING]  (241656) : Exiting Master process...
Jan 21 19:39:08 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [ALERT]    (241656) : Current worker (241658) exited with code 143 (Terminated)
Jan 21 19:39:08 np0005591285 neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8[241652]: [WARNING]  (241656) : All workers exited. Exiting... (0)
Jan 21 19:39:08 np0005591285 systemd[1]: libpod-2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9.scope: Deactivated successfully.
Jan 21 19:39:08 np0005591285 podman[242082]: 2026-01-22 00:39:08.546617904 +0000 UTC m=+0.184118349 container died 2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:39:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9-userdata-shm.mount: Deactivated successfully.
Jan 21 19:39:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay-b4f5a3f86a0da0b19a3f41725f2a2f8b24a64fdc90919740746a7197d75cbfed-merged.mount: Deactivated successfully.
Jan 21 19:39:08 np0005591285 podman[242082]: 2026-01-22 00:39:08.665284487 +0000 UTC m=+0.302784932 container cleanup 2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:39:08 np0005591285 systemd[1]: libpod-conmon-2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9.scope: Deactivated successfully.
Jan 21 19:39:08 np0005591285 podman[242110]: 2026-01-22 00:39:08.831605985 +0000 UTC m=+0.144533911 container remove 2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.837 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8c7e66c1-bcb4-434b-9169-d1b9d3a7babe]: (4, ('Thu Jan 22 12:39:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 (2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9)\n2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9\nThu Jan 22 12:39:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 (2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9)\n2b96b90ca1b6c2488016c4014ad6db19cbefd7f8f151aa9eda28bce32ca025c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.839 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6567d220-c2d6-4d6d-98d0-e59c10d4be83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.840 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01fa8e13-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:39:08 np0005591285 nova_compute[182755]: 2026-01-22 00:39:08.841 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:08 np0005591285 kernel: tap01fa8e13-90: left promiscuous mode
Jan 21 19:39:08 np0005591285 nova_compute[182755]: 2026-01-22 00:39:08.853 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.855 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef3fd89-93a9-4493-9834-160b0c887232]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.868 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d65e162f-06dc-497b-81d2-db768f7f7ed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.869 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[df5483fc-592e-43e0-990a-70bbe29c228b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.883 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[35434740-10e1-45c0-b634-309e4c67da49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 683709, 'reachable_time': 24093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242125, 'error': None, 'target': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.886 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:39:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:08.886 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1ea88c-6fb5-49a4-83fc-b4879bbe8cbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:39:08 np0005591285 systemd[1]: run-netns-ovnmeta\x2d01fa8e13\x2d9f62\x2d4b06\x2d88db\x2d79f2e6ca65b8.mount: Deactivated successfully.
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.385 182759 DEBUG nova.compute.manager [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-unplugged-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.386 182759 DEBUG oslo_concurrency.lockutils [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.386 182759 DEBUG oslo_concurrency.lockutils [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.387 182759 DEBUG oslo_concurrency.lockutils [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.387 182759 DEBUG nova.compute.manager [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] No waiting events found dispatching network-vif-unplugged-6158c039-5f87-4d75-91cd-734e6337b27f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.387 182759 DEBUG nova.compute.manager [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-unplugged-6158c039-5f87-4d75-91cd-734e6337b27f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.388 182759 DEBUG nova.compute.manager [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.388 182759 DEBUG oslo_concurrency.lockutils [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.389 182759 DEBUG oslo_concurrency.lockutils [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.389 182759 DEBUG oslo_concurrency.lockutils [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.389 182759 DEBUG nova.compute.manager [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] No waiting events found dispatching network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.390 182759 WARNING nova.compute.manager [req-1f9d28f0-c938-4299-af4b-28291472ff27 req-2397a39e-eedb-44c7-b4d3-ef51be435ccc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received unexpected event network-vif-plugged-6158c039-5f87-4d75-91cd-734e6337b27f for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.601 182759 DEBUG nova.network.neutron [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updated VIF entry in instance network info cache for port 6158c039-5f87-4d75-91cd-734e6337b27f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.602 182759 DEBUG nova.network.neutron [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [{"id": "6158c039-5f87-4d75-91cd-734e6337b27f", "address": "fa:16:3e:27:e9:26", "network": {"id": "96576974-adfc-492e-9141-63dd99e1cb25", "bridge": "br-int", "label": "tempest-network-smoke--1773827977", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6158c039-5f", "ovs_interfaceid": "6158c039-5f87-4d75-91cd-734e6337b27f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.632 182759 DEBUG oslo_concurrency.lockutils [req-ed25ff65-b958-485c-915d-c5aca277cf6b req-fb0360bf-8de7-4a34-9e9b-e2234a346924 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.632 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.633 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.633 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f5280c26-3c89-472c-96cd-5d580ed702ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:39:09 np0005591285 nova_compute[182755]: 2026-01-22 00:39:09.965 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:10 np0005591285 nova_compute[182755]: 2026-01-22 00:39:10.317 182759 DEBUG nova.compute.manager [req-0e9a59c3-5635-4ca6-bc70-52e4f4fdf5c6 req-d59b98ee-7b39-441a-b922-57b4826c84af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:10 np0005591285 nova_compute[182755]: 2026-01-22 00:39:10.318 182759 DEBUG oslo_concurrency.lockutils [req-0e9a59c3-5635-4ca6-bc70-52e4f4fdf5c6 req-d59b98ee-7b39-441a-b922-57b4826c84af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:10 np0005591285 nova_compute[182755]: 2026-01-22 00:39:10.318 182759 DEBUG oslo_concurrency.lockutils [req-0e9a59c3-5635-4ca6-bc70-52e4f4fdf5c6 req-d59b98ee-7b39-441a-b922-57b4826c84af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:10 np0005591285 nova_compute[182755]: 2026-01-22 00:39:10.318 182759 DEBUG oslo_concurrency.lockutils [req-0e9a59c3-5635-4ca6-bc70-52e4f4fdf5c6 req-d59b98ee-7b39-441a-b922-57b4826c84af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:10 np0005591285 nova_compute[182755]: 2026-01-22 00:39:10.318 182759 DEBUG nova.compute.manager [req-0e9a59c3-5635-4ca6-bc70-52e4f4fdf5c6 req-d59b98ee-7b39-441a-b922-57b4826c84af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] No waiting events found dispatching network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:39:10 np0005591285 nova_compute[182755]: 2026-01-22 00:39:10.319 182759 WARNING nova.compute.manager [req-0e9a59c3-5635-4ca6-bc70-52e4f4fdf5c6 req-d59b98ee-7b39-441a-b922-57b4826c84af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received unexpected event network-vif-plugged-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa for instance with vm_state active and task_state deleting.#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.139 182759 DEBUG nova.network.neutron [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.155 182759 INFO nova.compute.manager [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Took 3.17 seconds to deallocate network for instance.#033[00m
Jan 21 19:39:11 np0005591285 podman[242126]: 2026-01-22 00:39:11.17810643 +0000 UTC m=+0.053246198 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Jan 21 19:39:11 np0005591285 podman[242127]: 2026-01-22 00:39:11.187616596 +0000 UTC m=+0.061557862 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.231 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.232 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.288 182759 DEBUG nova.compute.provider_tree [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.302 182759 DEBUG nova.scheduler.client.report [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.321 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.341 182759 INFO nova.scheduler.client.report [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance f5280c26-3c89-472c-96cd-5d580ed702ce#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.423 182759 DEBUG oslo_concurrency.lockutils [None req-ab123f14-c852-402f-ad3f-97b0f89e77f8 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "f5280c26-3c89-472c-96cd-5d580ed702ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.500 182759 DEBUG nova.compute.manager [req-67c9d27b-b4cc-4e86-a5ef-08f3260b3020 req-0d5a4b03-7846-4525-bc5d-7ebbf531402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-deleted-6158c039-5f87-4d75-91cd-734e6337b27f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:11 np0005591285 nova_compute[182755]: 2026-01-22 00:39:11.500 182759 DEBUG nova.compute.manager [req-67c9d27b-b4cc-4e86-a5ef-08f3260b3020 req-0d5a4b03-7846-4525-bc5d-7ebbf531402d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Received event network-vif-deleted-dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.092 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updating instance_info_cache with network_info: [{"id": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "address": "fa:16:3e:20:7f:7e", "network": {"id": "01fa8e13-9f62-4b06-88db-79f2e6ca65b8", "bridge": "br-int", "label": "tempest-network-smoke--1773626540", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe20:7f7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc8f6b9c-58", "ovs_interfaceid": "dc8f6b9c-5810-44f0-9c16-cb38b34b2dfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.115 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-f5280c26-3c89-472c-96cd-5d580ed702ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.116 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.117 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.118 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.118 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.118 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:12 np0005591285 nova_compute[182755]: 2026-01-22 00:39:12.836 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:14 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:14.412 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:39:14 np0005591285 nova_compute[182755]: 2026-01-22 00:39:14.966 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:17 np0005591285 nova_compute[182755]: 2026-01-22 00:39:17.597 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:17 np0005591285 nova_compute[182755]: 2026-01-22 00:39:17.709 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:17 np0005591285 nova_compute[182755]: 2026-01-22 00:39:17.868 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:19 np0005591285 nova_compute[182755]: 2026-01-22 00:39:19.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:19 np0005591285 nova_compute[182755]: 2026-01-22 00:39:19.968 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:20 np0005591285 podman[242165]: 2026-01-22 00:39:20.186053993 +0000 UTC m=+0.061243204 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.241 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.412 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.413 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5710MB free_disk=73.17712783813477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.413 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.413 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.579 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.579 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.599 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.614 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.634 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:39:20 np0005591285 nova_compute[182755]: 2026-01-22 00:39:20.634 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:39:22 np0005591285 nova_compute[182755]: 2026-01-22 00:39:22.786 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042347.7846544, f5280c26-3c89-472c-96cd-5d580ed702ce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:39:22 np0005591285 nova_compute[182755]: 2026-01-22 00:39:22.786 182759 INFO nova.compute.manager [-] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:39:22 np0005591285 nova_compute[182755]: 2026-01-22 00:39:22.822 182759 DEBUG nova.compute.manager [None req-536a2e72-dfcd-4885-a1c4-d9bf152a073f - - - - - -] [instance: f5280c26-3c89-472c-96cd-5d580ed702ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:39:22 np0005591285 nova_compute[182755]: 2026-01-22 00:39:22.870 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:23 np0005591285 podman[242190]: 2026-01-22 00:39:23.195649423 +0000 UTC m=+0.052835457 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:39:23 np0005591285 podman[242189]: 2026-01-22 00:39:23.201578033 +0000 UTC m=+0.061748338 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:39:25 np0005591285 nova_compute[182755]: 2026-01-22 00:39:25.016 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:26 np0005591285 podman[242231]: 2026-01-22 00:39:26.225470757 +0000 UTC m=+0.088469798 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 21 19:39:27 np0005591285 nova_compute[182755]: 2026-01-22 00:39:27.872 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:30 np0005591285 nova_compute[182755]: 2026-01-22 00:39:30.018 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:32 np0005591285 nova_compute[182755]: 2026-01-22 00:39:32.874 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:35 np0005591285 nova_compute[182755]: 2026-01-22 00:39:35.019 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:37 np0005591285 nova_compute[182755]: 2026-01-22 00:39:37.875 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:40 np0005591285 nova_compute[182755]: 2026-01-22 00:39:40.020 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:39:42 np0005591285 podman[242260]: 2026-01-22 00:39:42.17937465 +0000 UTC m=+0.053025252 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:39:42 np0005591285 podman[242259]: 2026-01-22 00:39:42.218165667 +0000 UTC m=+0.083329040 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Jan 21 19:39:42 np0005591285 nova_compute[182755]: 2026-01-22 00:39:42.913 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:45 np0005591285 nova_compute[182755]: 2026-01-22 00:39:45.022 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:47 np0005591285 nova_compute[182755]: 2026-01-22 00:39:47.914 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:50 np0005591285 nova_compute[182755]: 2026-01-22 00:39:50.052 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:51 np0005591285 podman[242302]: 2026-01-22 00:39:51.166027448 +0000 UTC m=+0.043753132 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:39:52 np0005591285 nova_compute[182755]: 2026-01-22 00:39:52.916 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:54 np0005591285 podman[242326]: 2026-01-22 00:39:54.179001568 +0000 UTC m=+0.054778030 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:39:54 np0005591285 podman[242327]: 2026-01-22 00:39:54.234015702 +0000 UTC m=+0.094683976 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:39:55 np0005591285 nova_compute[182755]: 2026-01-22 00:39:55.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:57 np0005591285 podman[242370]: 2026-01-22 00:39:57.198337 +0000 UTC m=+0.074002967 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:39:57 np0005591285 nova_compute[182755]: 2026-01-22 00:39:57.918 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:58 np0005591285 nova_compute[182755]: 2026-01-22 00:39:58.522 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:39:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:58.523 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:39:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:58.525 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 19:39:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:39:59.526 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:40:00 np0005591285 nova_compute[182755]: 2026-01-22 00:40:00.066 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:02 np0005591285 nova_compute[182755]: 2026-01-22 00:40:02.921 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:40:03.007 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:40:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:40:03.008 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:40:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:40:03.008 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:40:05 np0005591285 nova_compute[182755]: 2026-01-22 00:40:05.068 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:05 np0005591285 ovn_controller[94908]: 2026-01-22T00:40:05Z|00669|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 21 19:40:07 np0005591285 nova_compute[182755]: 2026-01-22 00:40:07.923 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:08 np0005591285 nova_compute[182755]: 2026-01-22 00:40:08.635 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:08 np0005591285 nova_compute[182755]: 2026-01-22 00:40:08.635 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 19:40:08 np0005591285 nova_compute[182755]: 2026-01-22 00:40:08.635 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 19:40:08 np0005591285 nova_compute[182755]: 2026-01-22 00:40:08.667 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 19:40:10 np0005591285 nova_compute[182755]: 2026-01-22 00:40:10.070 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:11 np0005591285 nova_compute[182755]: 2026-01-22 00:40:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:11 np0005591285 nova_compute[182755]: 2026-01-22 00:40:11.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:11 np0005591285 nova_compute[182755]: 2026-01-22 00:40:11.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:11 np0005591285 nova_compute[182755]: 2026-01-22 00:40:11.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:11 np0005591285 nova_compute[182755]: 2026-01-22 00:40:11.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 19:40:12 np0005591285 nova_compute[182755]: 2026-01-22 00:40:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:12 np0005591285 nova_compute[182755]: 2026-01-22 00:40:12.960 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:13 np0005591285 podman[242397]: 2026-01-22 00:40:13.202110318 +0000 UTC m=+0.074671686 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 19:40:13 np0005591285 podman[242398]: 2026-01-22 00:40:13.21852213 +0000 UTC m=+0.088331844 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:40:15 np0005591285 nova_compute[182755]: 2026-01-22 00:40:15.113 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:17 np0005591285 nova_compute[182755]: 2026-01-22 00:40:17.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:20 np0005591285 nova_compute[182755]: 2026-01-22 00:40:20.115 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:40:20 np0005591285 nova_compute[182755]: 2026-01-22 00:40:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:20 np0005591285 nova_compute[182755]: 2026-01-22 00:40:20.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.250 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.250 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.251 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.251 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.451 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.454 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5747MB free_disk=73.17712783813477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.455 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.455 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.573 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.574 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.628 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.681 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.682 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.695 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.718 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.739 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.758 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.760 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:40:21 np0005591285 nova_compute[182755]: 2026-01-22 00:40:21.760 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:40:22 np0005591285 podman[242435]: 2026-01-22 00:40:22.185724255 +0000 UTC m=+0.055944641 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:40:23 np0005591285 nova_compute[182755]: 2026-01-22 00:40:23.004 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:40:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:40:25 np0005591285 nova_compute[182755]: 2026-01-22 00:40:25.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:25 np0005591285 podman[242459]: 2026-01-22 00:40:25.258096078 +0000 UTC m=+0.134248384 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 19:40:25 np0005591285 podman[242460]: 2026-01-22 00:40:25.268773966 +0000 UTC m=+0.049940218 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:40:28 np0005591285 nova_compute[182755]: 2026-01-22 00:40:28.005 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:28 np0005591285 podman[242503]: 2026-01-22 00:40:28.20482456 +0000 UTC m=+0.083016931 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:40:30 np0005591285 nova_compute[182755]: 2026-01-22 00:40:30.209 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:33 np0005591285 nova_compute[182755]: 2026-01-22 00:40:33.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:33 np0005591285 nova_compute[182755]: 2026-01-22 00:40:33.757 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:40:35 np0005591285 nova_compute[182755]: 2026-01-22 00:40:35.210 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:38 np0005591285 nova_compute[182755]: 2026-01-22 00:40:38.011 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:40 np0005591285 nova_compute[182755]: 2026-01-22 00:40:40.213 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:43 np0005591285 nova_compute[182755]: 2026-01-22 00:40:43.013 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:44 np0005591285 podman[242530]: 2026-01-22 00:40:44.18058098 +0000 UTC m=+0.056887306 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64)
Jan 21 19:40:44 np0005591285 podman[242531]: 2026-01-22 00:40:44.189570714 +0000 UTC m=+0.062166949 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:40:45 np0005591285 nova_compute[182755]: 2026-01-22 00:40:45.215 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:48 np0005591285 nova_compute[182755]: 2026-01-22 00:40:48.055 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:50 np0005591285 nova_compute[182755]: 2026-01-22 00:40:50.260 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:53 np0005591285 nova_compute[182755]: 2026-01-22 00:40:53.057 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:53 np0005591285 podman[242571]: 2026-01-22 00:40:53.212951564 +0000 UTC m=+0.091198733 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:40:55 np0005591285 nova_compute[182755]: 2026-01-22 00:40:55.304 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:56 np0005591285 podman[242597]: 2026-01-22 00:40:56.17803007 +0000 UTC m=+0.049382383 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:40:56 np0005591285 podman[242596]: 2026-01-22 00:40:56.191229966 +0000 UTC m=+0.067989295 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:40:58 np0005591285 nova_compute[182755]: 2026-01-22 00:40:58.059 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:40:59 np0005591285 podman[242637]: 2026-01-22 00:40:59.234829102 +0000 UTC m=+0.111607902 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 19:41:00 np0005591285 nova_compute[182755]: 2026-01-22 00:41:00.364 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:03.009 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:41:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:03.009 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:41:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:03.009 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:41:03 np0005591285 nova_compute[182755]: 2026-01-22 00:41:03.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:03.909 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:41:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:03.910 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:41:03 np0005591285 nova_compute[182755]: 2026-01-22 00:41:03.910 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:05 np0005591285 nova_compute[182755]: 2026-01-22 00:41:05.364 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:07.912 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:41:08 np0005591285 nova_compute[182755]: 2026-01-22 00:41:08.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:09 np0005591285 nova_compute[182755]: 2026-01-22 00:41:09.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:09 np0005591285 nova_compute[182755]: 2026-01-22 00:41:09.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:41:09 np0005591285 nova_compute[182755]: 2026-01-22 00:41:09.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:41:09 np0005591285 nova_compute[182755]: 2026-01-22 00:41:09.231 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:41:10 np0005591285 nova_compute[182755]: 2026-01-22 00:41:10.364 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:11 np0005591285 nova_compute[182755]: 2026-01-22 00:41:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:12 np0005591285 nova_compute[182755]: 2026-01-22 00:41:12.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:12 np0005591285 nova_compute[182755]: 2026-01-22 00:41:12.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:12 np0005591285 nova_compute[182755]: 2026-01-22 00:41:12.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:41:13 np0005591285 nova_compute[182755]: 2026-01-22 00:41:13.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:13 np0005591285 nova_compute[182755]: 2026-01-22 00:41:13.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:13 np0005591285 nova_compute[182755]: 2026-01-22 00:41:13.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:15 np0005591285 podman[242662]: 2026-01-22 00:41:15.187172383 +0000 UTC m=+0.061731267 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Jan 21 19:41:15 np0005591285 podman[242663]: 2026-01-22 00:41:15.229974158 +0000 UTC m=+0.096127595 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:41:15 np0005591285 nova_compute[182755]: 2026-01-22 00:41:15.366 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:18 np0005591285 nova_compute[182755]: 2026-01-22 00:41:18.069 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:20 np0005591285 nova_compute[182755]: 2026-01-22 00:41:20.369 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.246 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.246 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.396 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.397 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5741MB free_disk=73.17712783813477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.397 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.398 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.460 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.461 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.483 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.497 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.499 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:41:22 np0005591285 nova_compute[182755]: 2026-01-22 00:41:22.499 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:41:23 np0005591285 nova_compute[182755]: 2026-01-22 00:41:23.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:24 np0005591285 podman[242704]: 2026-01-22 00:41:24.190288947 +0000 UTC m=+0.062053696 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:41:25 np0005591285 nova_compute[182755]: 2026-01-22 00:41:25.372 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:27 np0005591285 podman[242728]: 2026-01-22 00:41:27.17742588 +0000 UTC m=+0.053440933 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 19:41:27 np0005591285 podman[242729]: 2026-01-22 00:41:27.214167501 +0000 UTC m=+0.087764289 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:41:28 np0005591285 nova_compute[182755]: 2026-01-22 00:41:28.073 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:30 np0005591285 podman[242768]: 2026-01-22 00:41:30.256064962 +0000 UTC m=+0.133001271 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:41:30 np0005591285 nova_compute[182755]: 2026-01-22 00:41:30.372 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:33 np0005591285 nova_compute[182755]: 2026-01-22 00:41:33.076 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:35 np0005591285 nova_compute[182755]: 2026-01-22 00:41:35.374 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:35.853 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8::f816:3eff:fe5f:63f3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a8aebb4-643e-4d79-9b9e-71408c2b29d3) old=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:41:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:35.855 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 updated#033[00m
Jan 21 19:41:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:35.856 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:41:35 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:35.858 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c686c2a5-9cc4-4f3d-ad99-e5fb9827a5e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:41:38 np0005591285 nova_compute[182755]: 2026-01-22 00:41:38.078 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:40 np0005591285 nova_compute[182755]: 2026-01-22 00:41:40.376 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:41.551 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8:0:1:f816:3eff:fe5f:63f3 2001:db8::f816:3eff:fe5f:63f3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe5f:63f3/64 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a8aebb4-643e-4d79-9b9e-71408c2b29d3) old=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8::f816:3eff:fe5f:63f3'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:41:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:41.552 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 updated#033[00m
Jan 21 19:41:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:41.553 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:41:41 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:41:41.554 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f754658c-6b66-48d1-a790-c52a5169f746]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:41:43 np0005591285 nova_compute[182755]: 2026-01-22 00:41:43.080 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:45 np0005591285 nova_compute[182755]: 2026-01-22 00:41:45.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:46 np0005591285 podman[242797]: 2026-01-22 00:41:46.199782107 +0000 UTC m=+0.064984435 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 19:41:46 np0005591285 podman[242796]: 2026-01-22 00:41:46.207354851 +0000 UTC m=+0.071982443 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 19:41:48 np0005591285 nova_compute[182755]: 2026-01-22 00:41:48.081 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:50 np0005591285 nova_compute[182755]: 2026-01-22 00:41:50.379 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:53 np0005591285 nova_compute[182755]: 2026-01-22 00:41:53.083 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:55 np0005591285 podman[242836]: 2026-01-22 00:41:55.201065869 +0000 UTC m=+0.073511725 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:41:55 np0005591285 nova_compute[182755]: 2026-01-22 00:41:55.380 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:58 np0005591285 nova_compute[182755]: 2026-01-22 00:41:58.084 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:41:58 np0005591285 podman[242861]: 2026-01-22 00:41:58.174289387 +0000 UTC m=+0.049286722 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:41:58 np0005591285 podman[242860]: 2026-01-22 00:41:58.17441505 +0000 UTC m=+0.051624615 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 19:42:00 np0005591285 nova_compute[182755]: 2026-01-22 00:42:00.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:01 np0005591285 podman[242902]: 2026-01-22 00:42:01.207836613 +0000 UTC m=+0.081024018 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 19:42:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:03.010 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:03.010 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:03.011 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:03 np0005591285 nova_compute[182755]: 2026-01-22 00:42:03.086 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:05 np0005591285 nova_compute[182755]: 2026-01-22 00:42:05.462 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:08 np0005591285 nova_compute[182755]: 2026-01-22 00:42:08.088 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:09 np0005591285 nova_compute[182755]: 2026-01-22 00:42:09.498 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:09 np0005591285 nova_compute[182755]: 2026-01-22 00:42:09.499 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:42:09 np0005591285 nova_compute[182755]: 2026-01-22 00:42:09.499 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:42:09 np0005591285 nova_compute[182755]: 2026-01-22 00:42:09.513 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:42:10 np0005591285 nova_compute[182755]: 2026-01-22 00:42:10.515 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:11 np0005591285 nova_compute[182755]: 2026-01-22 00:42:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:12 np0005591285 nova_compute[182755]: 2026-01-22 00:42:12.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:12 np0005591285 nova_compute[182755]: 2026-01-22 00:42:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:12 np0005591285 nova_compute[182755]: 2026-01-22 00:42:12.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:42:13 np0005591285 nova_compute[182755]: 2026-01-22 00:42:13.089 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:13 np0005591285 nova_compute[182755]: 2026-01-22 00:42:13.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:14 np0005591285 nova_compute[182755]: 2026-01-22 00:42:14.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:15 np0005591285 nova_compute[182755]: 2026-01-22 00:42:15.517 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:17 np0005591285 podman[242930]: 2026-01-22 00:42:17.216987583 +0000 UTC m=+0.073454863 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, version=9.6)
Jan 21 19:42:17 np0005591285 podman[242931]: 2026-01-22 00:42:17.216913651 +0000 UTC m=+0.071107779 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.056 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.057 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.076 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.091 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.197 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.198 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.205 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.205 182759 INFO nova.compute.claims [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.336 182759 DEBUG nova.compute.provider_tree [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.351 182759 DEBUG nova.scheduler.client.report [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.374 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.375 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.460 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.460 182759 DEBUG nova.network.neutron [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.485 182759 INFO nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.519 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.640 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.641 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.642 182759 INFO nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Creating image(s)#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.642 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.643 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.643 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.658 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.714 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.715 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.715 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.727 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.782 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.783 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.818 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.819 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.819 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.888 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.889 182759 DEBUG nova.virt.disk.api [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.890 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.950 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.951 182759 DEBUG nova.virt.disk.api [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.952 182759 DEBUG nova.objects.instance [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 03f0da8f-c1d4-4432-bd08-77122a64e6b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.970 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.970 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Ensure instance console log exists: /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.971 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.971 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:18 np0005591285 nova_compute[182755]: 2026-01-22 00:42:18.971 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:19 np0005591285 nova_compute[182755]: 2026-01-22 00:42:19.262 182759 DEBUG nova.policy [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:42:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:19.911 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:42:19 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:19.912 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:42:19 np0005591285 nova_compute[182755]: 2026-01-22 00:42:19.913 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:20 np0005591285 nova_compute[182755]: 2026-01-22 00:42:20.342 182759 DEBUG nova.network.neutron [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Successfully created port: df0b1d8d-bf36-48ca-b912-b8b71d623097 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:42:20 np0005591285 nova_compute[182755]: 2026-01-22 00:42:20.519 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.503 182759 DEBUG nova.network.neutron [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Successfully updated port: df0b1d8d-bf36-48ca-b912-b8b71d623097 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.520 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.521 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.521 182759 DEBUG nova.network.neutron [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.667 182759 DEBUG nova.network.neutron [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.985 182759 DEBUG nova.compute.manager [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-changed-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.985 182759 DEBUG nova.compute.manager [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Refreshing instance network info cache due to event network-changed-df0b1d8d-bf36-48ca-b912-b8b71d623097. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:42:21 np0005591285 nova_compute[182755]: 2026-01-22 00:42:21.985 182759 DEBUG oslo_concurrency.lockutils [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:42:22 np0005591285 nova_compute[182755]: 2026-01-22 00:42:22.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:23 np0005591285 nova_compute[182755]: 2026-01-22 00:42:23.093 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.178 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:42:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:42:23 np0005591285 nova_compute[182755]: 2026-01-22 00:42:23.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.248 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.379 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.379 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5741MB free_disk=73.17691802978516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.380 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.380 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.472 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance 03f0da8f-c1d4-4432-bd08-77122a64e6b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.473 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.473 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.634 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.698 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.730 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:42:24 np0005591285 nova_compute[182755]: 2026-01-22 00:42:24.730 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.337 182759 DEBUG nova.network.neutron [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updating instance_info_cache with network_info: [{"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.371 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.371 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Instance network_info: |[{"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.372 182759 DEBUG oslo_concurrency.lockutils [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.372 182759 DEBUG nova.network.neutron [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Refreshing network info cache for port df0b1d8d-bf36-48ca-b912-b8b71d623097 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.375 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Start _get_guest_xml network_info=[{"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.378 182759 WARNING nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.382 182759 DEBUG nova.virt.libvirt.host [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.382 182759 DEBUG nova.virt.libvirt.host [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.389 182759 DEBUG nova.virt.libvirt.host [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.389 182759 DEBUG nova.virt.libvirt.host [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.390 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.391 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.391 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.391 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.391 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.392 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.392 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.392 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.392 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.392 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.393 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.393 182759 DEBUG nova.virt.hardware [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.396 182759 DEBUG nova.virt.libvirt.vif [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-813069027',display_name='tempest-TestGettingAddress-server-813069027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-813069027',id=180,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfcFV8SVrYtqhuEJR2u0WnZRZ3aIYjdzcrjfLDmTvbNVw+iNWtleLlqtVUIYQyWXU5cujTqDWjA511UjJA6kRMyxPcbENHgTLoJy3T95U8C9/oslNz/OBwLaWEuXg2SRA==',key_name='tempest-TestGettingAddress-313470075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jup1ca84',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:42:18Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=03f0da8f-c1d4-4432-bd08-77122a64e6b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", 
"dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.396 182759 DEBUG nova.network.os_vif_util [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.397 182759 DEBUG nova.network.os_vif_util [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.398 182759 DEBUG nova.objects.instance [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 03f0da8f-c1d4-4432-bd08-77122a64e6b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.410 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <uuid>03f0da8f-c1d4-4432-bd08-77122a64e6b9</uuid>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <name>instance-000000b4</name>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestGettingAddress-server-813069027</nova:name>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:42:25</nova:creationTime>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        <nova:port uuid="df0b1d8d-bf36-48ca-b912-b8b71d623097">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe3e:2bed" ipVersion="6"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe3e:2bed" ipVersion="6"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <entry name="serial">03f0da8f-c1d4-4432-bd08-77122a64e6b9</entry>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <entry name="uuid">03f0da8f-c1d4-4432-bd08-77122a64e6b9</entry>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.config"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:3e:2b:ed"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <target dev="tapdf0b1d8d-bf"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/console.log" append="off"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:42:25 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:42:25 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:42:25 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:42:25 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.411 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Preparing to wait for external event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.411 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.412 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.412 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.413 182759 DEBUG nova.virt.libvirt.vif [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-813069027',display_name='tempest-TestGettingAddress-server-813069027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-813069027',id=180,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfcFV8SVrYtqhuEJR2u0WnZRZ3aIYjdzcrjfLDmTvbNVw+iNWtleLlqtVUIYQyWXU5cujTqDWjA511UjJA6kRMyxPcbENHgTLoJy3T95U8C9/oslNz/OBwLaWEuXg2SRA==',key_name='tempest-TestGettingAddress-313470075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jup1ca84',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:42:18Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=03f0da8f-c1d4-4432-bd08-77122a64e6b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.413 182759 DEBUG nova.network.os_vif_util [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.415 182759 DEBUG nova.network.os_vif_util [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.415 182759 DEBUG os_vif [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.416 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.417 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.421 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.422 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf0b1d8d-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.422 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf0b1d8d-bf, col_values=(('external_ids', {'iface-id': 'df0b1d8d-bf36-48ca-b912-b8b71d623097', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:2b:ed', 'vm-uuid': '03f0da8f-c1d4-4432-bd08-77122a64e6b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.460 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:25 np0005591285 NetworkManager[55017]: <info>  [1769042545.4624] manager: (tapdf0b1d8d-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.464 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.471 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.472 182759 INFO os_vif [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf')#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.513 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.514 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.514 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:3e:2b:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.514 182759 INFO nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Using config drive#033[00m
Jan 21 19:42:25 np0005591285 nova_compute[182755]: 2026-01-22 00:42:25.520 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 podman[242987]: 2026-01-22 00:42:26.243405004 +0000 UTC m=+0.100482452 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.261 182759 INFO nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Creating config drive at /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.config#033[00m
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.268 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptw5fc_o9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.408 182759 DEBUG oslo_concurrency.processutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptw5fc_o9" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:42:26 np0005591285 kernel: tapdf0b1d8d-bf: entered promiscuous mode
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.5000] manager: (tapdf0b1d8d-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.530 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:26Z|00670|binding|INFO|Claiming lport df0b1d8d-bf36-48ca-b912-b8b71d623097 for this chassis.
Jan 21 19:42:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:26Z|00671|binding|INFO|df0b1d8d-bf36-48ca-b912-b8b71d623097: Claiming fa:16:3e:3e:2b:ed 10.100.0.5 2001:db8:0:1:f816:3eff:fe3e:2bed 2001:db8::f816:3eff:fe3e:2bed
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.532 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 systemd-udevd[243026]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.537 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.5427] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.542 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.5433] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.5461] device (tapdf0b1d8d-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.5470] device (tapdf0b1d8d-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.549 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:2b:ed 10.100.0.5 2001:db8:0:1:f816:3eff:fe3e:2bed 2001:db8::f816:3eff:fe3e:2bed'], port_security=['fa:16:3e:3e:2b:ed 10.100.0.5 2001:db8:0:1:f816:3eff:fe3e:2bed 2001:db8::f816:3eff:fe3e:2bed'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe3e:2bed/64 2001:db8::f816:3eff:fe3e:2bed/64', 'neutron:device_id': '03f0da8f-c1d4-4432-bd08-77122a64e6b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ee79161b-ebd1-43ab-81bd-31efca053e81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=df0b1d8d-bf36-48ca-b912-b8b71d623097) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.550 104259 INFO neutron.agent.ovn.metadata.agent [-] Port df0b1d8d-bf36-48ca-b912-b8b71d623097 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 bound to our chassis#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.551 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0fbc923c-90ec-4c3d-92df-bc42843601b3#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.563 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c076dd4f-a6de-4638-a9e1-123a398b3a66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.564 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0fbc923c-91 in ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.566 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0fbc923c-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.566 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[51cd2c4f-0757-4e09-86c5-c81171226011]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.567 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[382cfd85-cc51-4c09-82ac-501d45e551ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 systemd-machined[154022]: New machine qemu-76-instance-000000b4.
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.580 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[3eeb1ec2-731e-4ec7-ba64-3e826e0848e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.607 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[49ae4b7f-cfc8-450e-9969-ad71d9f388bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 systemd[1]: Started Virtual Machine qemu-76-instance-000000b4.
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.634 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.638 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[15ddbb4e-0c89-4dfa-b46b-fbb1b78af857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:26Z|00672|binding|INFO|Setting lport df0b1d8d-bf36-48ca-b912-b8b71d623097 ovn-installed in OVS
Jan 21 19:42:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:26Z|00673|binding|INFO|Setting lport df0b1d8d-bf36-48ca-b912-b8b71d623097 up in Southbound
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.644 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.646 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a0328005-c257-4d7d-b939-c8e9c8c6e58b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.6471] manager: (tap0fbc923c-90): new Veth device (/org/freedesktop/NetworkManager/Devices/332)
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.674 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[521cad6c-98ea-4ca9-9f14-c185df502d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.677 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[f61f9475-d083-4b85-90a4-a7620c88a1d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.7248] device (tap0fbc923c-90): carrier: link connected
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.729 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e13a6a-f940-48e9-9103-51ee38c19935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.747 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f46530-6f24-4b07-a6b5-22298eec968e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0fbc923c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:63:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710535, 'reachable_time': 30275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243061, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.765 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[198812b3-b248-4d32-b120-d0f93499bfd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:63f3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710535, 'tstamp': 710535}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243063, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.784 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[75068ab4-935e-42ce-9ee3-3ab5507c211a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0fbc923c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:63:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710535, 'reachable_time': 30275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243064, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.817 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf39623-44df-442d-9f7b-e583ecb3c2c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.880 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[31211d5d-d586-41d8-8dd9-e741d4ae1004]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.882 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fbc923c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.882 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.882 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0fbc923c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.884 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 kernel: tap0fbc923c-90: entered promiscuous mode
Jan 21 19:42:26 np0005591285 NetworkManager[55017]: <info>  [1769042546.8872] manager: (tap0fbc923c-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.890 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0fbc923c-90, col_values=(('external_ids', {'iface-id': '2a8aebb4-643e-4d79-9b9e-71408c2b29d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.892 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:26Z|00674|binding|INFO|Releasing lport 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 from this chassis (sb_readonly=0)
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.893 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.894 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0fbc923c-90ec-4c3d-92df-bc42843601b3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0fbc923c-90ec-4c3d-92df-bc42843601b3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:42:26 np0005591285 nova_compute[182755]: 2026-01-22 00:42:26.903 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.903 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f54665-d51f-4f0c-a14f-87c8a87e63dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.904 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-0fbc923c-90ec-4c3d-92df-bc42843601b3
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/0fbc923c-90ec-4c3d-92df-bc42843601b3.pid.haproxy
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 0fbc923c-90ec-4c3d-92df-bc42843601b3
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:42:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:26.905 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'env', 'PROCESS_TAG=haproxy-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0fbc923c-90ec-4c3d-92df-bc42843601b3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.092 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042547.091801, 03f0da8f-c1d4-4432-bd08-77122a64e6b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.092 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] VM Started (Lifecycle Event)#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.112 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.117 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042547.092597, 03f0da8f-c1d4-4432-bd08-77122a64e6b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.118 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.136 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.140 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.162 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.206 182759 DEBUG nova.compute.manager [req-7663e418-50be-403b-abe8-b684f084bfce req-e6aa8069-0831-49f4-a753-b8ac2b6f5a4f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.206 182759 DEBUG oslo_concurrency.lockutils [req-7663e418-50be-403b-abe8-b684f084bfce req-e6aa8069-0831-49f4-a753-b8ac2b6f5a4f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.207 182759 DEBUG oslo_concurrency.lockutils [req-7663e418-50be-403b-abe8-b684f084bfce req-e6aa8069-0831-49f4-a753-b8ac2b6f5a4f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.207 182759 DEBUG oslo_concurrency.lockutils [req-7663e418-50be-403b-abe8-b684f084bfce req-e6aa8069-0831-49f4-a753-b8ac2b6f5a4f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.208 182759 DEBUG nova.compute.manager [req-7663e418-50be-403b-abe8-b684f084bfce req-e6aa8069-0831-49f4-a753-b8ac2b6f5a4f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Processing event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.208 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.211 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042547.2115588, 03f0da8f-c1d4-4432-bd08-77122a64e6b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.212 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.213 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.216 182759 INFO nova.virt.libvirt.driver [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Instance spawned successfully.#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.217 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.238 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.243 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.244 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.245 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.245 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.246 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.247 182759 DEBUG nova.virt.libvirt.driver [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.257 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:42:27 np0005591285 podman[243103]: 2026-01-22 00:42:27.273449492 +0000 UTC m=+0.055524829 container create 3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.304 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:42:27 np0005591285 systemd[1]: Started libpod-conmon-3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404.scope.
Jan 21 19:42:27 np0005591285 podman[243103]: 2026-01-22 00:42:27.244692006 +0000 UTC m=+0.026767383 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:42:27 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:42:27 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0709a4967a78abefa8b3a8e63969120cd6473e502c16790afa3b9bd6b12c499/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.350 182759 INFO nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Took 8.71 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.351 182759 DEBUG nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:42:27 np0005591285 podman[243103]: 2026-01-22 00:42:27.357220683 +0000 UTC m=+0.139296000 container init 3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:42:27 np0005591285 podman[243103]: 2026-01-22 00:42:27.36379665 +0000 UTC m=+0.145871937 container start 3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:42:27 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [NOTICE]   (243123) : New worker (243125) forked
Jan 21 19:42:27 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [NOTICE]   (243123) : Loading success.
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.450 182759 INFO nova.compute.manager [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Took 9.30 seconds to build instance.#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.479 182759 DEBUG oslo_concurrency.lockutils [None req-209c5657-1b77-421b-8c86-f747023daa56 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.694 182759 DEBUG nova.network.neutron [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updated VIF entry in instance network info cache for port df0b1d8d-bf36-48ca-b912-b8b71d623097. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.695 182759 DEBUG nova.network.neutron [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updating instance_info_cache with network_info: [{"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:42:27 np0005591285 nova_compute[182755]: 2026-01-22 00:42:27.711 182759 DEBUG oslo_concurrency.lockutils [req-65eb7941-da49-4891-acfa-dd320133da30 req-145817db-c374-40dd-84c2-71014a01e344 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:42:27 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:27.915 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:29 np0005591285 podman[243134]: 2026-01-22 00:42:29.189916801 +0000 UTC m=+0.055295712 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:42:29 np0005591285 podman[243135]: 2026-01-22 00:42:29.190419215 +0000 UTC m=+0.053060292 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:42:29 np0005591285 nova_compute[182755]: 2026-01-22 00:42:29.282 182759 DEBUG nova.compute.manager [req-c5a04043-f27f-4d0e-9123-11f8f02c1fee req-58205282-19ef-488f-ac6a-e2a3fa86d620 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:29 np0005591285 nova_compute[182755]: 2026-01-22 00:42:29.283 182759 DEBUG oslo_concurrency.lockutils [req-c5a04043-f27f-4d0e-9123-11f8f02c1fee req-58205282-19ef-488f-ac6a-e2a3fa86d620 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:29 np0005591285 nova_compute[182755]: 2026-01-22 00:42:29.283 182759 DEBUG oslo_concurrency.lockutils [req-c5a04043-f27f-4d0e-9123-11f8f02c1fee req-58205282-19ef-488f-ac6a-e2a3fa86d620 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:29 np0005591285 nova_compute[182755]: 2026-01-22 00:42:29.284 182759 DEBUG oslo_concurrency.lockutils [req-c5a04043-f27f-4d0e-9123-11f8f02c1fee req-58205282-19ef-488f-ac6a-e2a3fa86d620 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:29 np0005591285 nova_compute[182755]: 2026-01-22 00:42:29.284 182759 DEBUG nova.compute.manager [req-c5a04043-f27f-4d0e-9123-11f8f02c1fee req-58205282-19ef-488f-ac6a-e2a3fa86d620 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] No waiting events found dispatching network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:42:29 np0005591285 nova_compute[182755]: 2026-01-22 00:42:29.284 182759 WARNING nova.compute.manager [req-c5a04043-f27f-4d0e-9123-11f8f02c1fee req-58205282-19ef-488f-ac6a-e2a3fa86d620 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received unexpected event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:42:30 np0005591285 nova_compute[182755]: 2026-01-22 00:42:30.512 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:30 np0005591285 nova_compute[182755]: 2026-01-22 00:42:30.524 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:32 np0005591285 podman[243173]: 2026-01-22 00:42:32.22694795 +0000 UTC m=+0.091570321 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:42:34 np0005591285 nova_compute[182755]: 2026-01-22 00:42:34.038 182759 DEBUG nova.compute.manager [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-changed-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:34 np0005591285 nova_compute[182755]: 2026-01-22 00:42:34.039 182759 DEBUG nova.compute.manager [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Refreshing instance network info cache due to event network-changed-df0b1d8d-bf36-48ca-b912-b8b71d623097. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:42:34 np0005591285 nova_compute[182755]: 2026-01-22 00:42:34.039 182759 DEBUG oslo_concurrency.lockutils [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:42:34 np0005591285 nova_compute[182755]: 2026-01-22 00:42:34.039 182759 DEBUG oslo_concurrency.lockutils [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:42:34 np0005591285 nova_compute[182755]: 2026-01-22 00:42:34.040 182759 DEBUG nova.network.neutron [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Refreshing network info cache for port df0b1d8d-bf36-48ca-b912-b8b71d623097 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:42:34 np0005591285 nova_compute[182755]: 2026-01-22 00:42:34.726 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:42:35 np0005591285 nova_compute[182755]: 2026-01-22 00:42:35.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:35 np0005591285 nova_compute[182755]: 2026-01-22 00:42:35.527 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:36 np0005591285 nova_compute[182755]: 2026-01-22 00:42:36.495 182759 DEBUG nova.network.neutron [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updated VIF entry in instance network info cache for port df0b1d8d-bf36-48ca-b912-b8b71d623097. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:42:36 np0005591285 nova_compute[182755]: 2026-01-22 00:42:36.496 182759 DEBUG nova.network.neutron [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updating instance_info_cache with network_info: [{"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:42:36 np0005591285 nova_compute[182755]: 2026-01-22 00:42:36.532 182759 DEBUG oslo_concurrency.lockutils [req-f63ce90b-104c-4efb-989e-4957064d09a1 req-cb58b15a-9a0a-4363-8099-7ef1db189995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:42:39 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:39Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:2b:ed 10.100.0.5
Jan 21 19:42:39 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:39Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:2b:ed 10.100.0.5
Jan 21 19:42:40 np0005591285 nova_compute[182755]: 2026-01-22 00:42:40.530 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:40 np0005591285 nova_compute[182755]: 2026-01-22 00:42:40.533 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:40 np0005591285 nova_compute[182755]: 2026-01-22 00:42:40.533 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 21 19:42:40 np0005591285 nova_compute[182755]: 2026-01-22 00:42:40.533 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 21 19:42:40 np0005591285 nova_compute[182755]: 2026-01-22 00:42:40.561 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:40 np0005591285 nova_compute[182755]: 2026-01-22 00:42:40.562 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 21 19:42:45 np0005591285 nova_compute[182755]: 2026-01-22 00:42:45.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:45 np0005591285 nova_compute[182755]: 2026-01-22 00:42:45.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:45 np0005591285 nova_compute[182755]: 2026-01-22 00:42:45.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 21 19:42:45 np0005591285 nova_compute[182755]: 2026-01-22 00:42:45.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 21 19:42:45 np0005591285 nova_compute[182755]: 2026-01-22 00:42:45.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:45 np0005591285 nova_compute[182755]: 2026-01-22 00:42:45.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 21 19:42:48 np0005591285 podman[243217]: 2026-01-22 00:42:48.193410042 +0000 UTC m=+0.058184612 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350)
Jan 21 19:42:48 np0005591285 podman[243218]: 2026-01-22 00:42:48.229572978 +0000 UTC m=+0.079923518 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:42:50 np0005591285 nova_compute[182755]: 2026-01-22 00:42:50.613 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:50 np0005591285 nova_compute[182755]: 2026-01-22 00:42:50.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:50 np0005591285 nova_compute[182755]: 2026-01-22 00:42:50.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 21 19:42:50 np0005591285 nova_compute[182755]: 2026-01-22 00:42:50.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 21 19:42:50 np0005591285 nova_compute[182755]: 2026-01-22 00:42:50.652 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:50 np0005591285 nova_compute[182755]: 2026-01-22 00:42:50.653 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.227 182759 DEBUG nova.compute.manager [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-changed-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.228 182759 DEBUG nova.compute.manager [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Refreshing instance network info cache due to event network-changed-df0b1d8d-bf36-48ca-b912-b8b71d623097. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.228 182759 DEBUG oslo_concurrency.lockutils [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.228 182759 DEBUG oslo_concurrency.lockutils [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.228 182759 DEBUG nova.network.neutron [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Refreshing network info cache for port df0b1d8d-bf36-48ca-b912-b8b71d623097 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.295 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.296 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.296 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.297 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.297 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.310 182759 INFO nova.compute.manager [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Terminating instance#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.322 182759 DEBUG nova.compute.manager [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:42:54 np0005591285 kernel: tapdf0b1d8d-bf (unregistering): left promiscuous mode
Jan 21 19:42:54 np0005591285 NetworkManager[55017]: <info>  [1769042574.3496] device (tapdf0b1d8d-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.397 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:54Z|00675|binding|INFO|Releasing lport df0b1d8d-bf36-48ca-b912-b8b71d623097 from this chassis (sb_readonly=0)
Jan 21 19:42:54 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:54Z|00676|binding|INFO|Setting lport df0b1d8d-bf36-48ca-b912-b8b71d623097 down in Southbound
Jan 21 19:42:54 np0005591285 ovn_controller[94908]: 2026-01-22T00:42:54Z|00677|binding|INFO|Removing iface tapdf0b1d8d-bf ovn-installed in OVS
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.402 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.408 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:2b:ed 10.100.0.5 2001:db8:0:1:f816:3eff:fe3e:2bed 2001:db8::f816:3eff:fe3e:2bed'], port_security=['fa:16:3e:3e:2b:ed 10.100.0.5 2001:db8:0:1:f816:3eff:fe3e:2bed 2001:db8::f816:3eff:fe3e:2bed'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28 2001:db8:0:1:f816:3eff:fe3e:2bed/64 2001:db8::f816:3eff:fe3e:2bed/64', 'neutron:device_id': '03f0da8f-c1d4-4432-bd08-77122a64e6b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ee79161b-ebd1-43ab-81bd-31efca053e81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=df0b1d8d-bf36-48ca-b912-b8b71d623097) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.411 104259 INFO neutron.agent.ovn.metadata.agent [-] Port df0b1d8d-bf36-48ca-b912-b8b71d623097 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 unbound from our chassis#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.413 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.413 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.415 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1cef7226-c8a4-4855-ae5c-b6c79c5183a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.416 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 namespace which is not needed anymore#033[00m
Jan 21 19:42:54 np0005591285 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Jan 21 19:42:54 np0005591285 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000b4.scope: Consumed 14.053s CPU time.
Jan 21 19:42:54 np0005591285 systemd-machined[154022]: Machine qemu-76-instance-000000b4 terminated.
Jan 21 19:42:54 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [NOTICE]   (243123) : haproxy version is 2.8.14-c23fe91
Jan 21 19:42:54 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [NOTICE]   (243123) : path to executable is /usr/sbin/haproxy
Jan 21 19:42:54 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [WARNING]  (243123) : Exiting Master process...
Jan 21 19:42:54 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [WARNING]  (243123) : Exiting Master process...
Jan 21 19:42:54 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [ALERT]    (243123) : Current worker (243125) exited with code 143 (Terminated)
Jan 21 19:42:54 np0005591285 neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3[243119]: [WARNING]  (243123) : All workers exited. Exiting... (0)
Jan 21 19:42:54 np0005591285 systemd[1]: libpod-3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404.scope: Deactivated successfully.
Jan 21 19:42:54 np0005591285 conmon[243119]: conmon 3946a81ce75d4d50350b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404.scope/container/memory.events
Jan 21 19:42:54 np0005591285 podman[243283]: 2026-01-22 00:42:54.570105048 +0000 UTC m=+0.050638908 container died 3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:42:54 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404-userdata-shm.mount: Deactivated successfully.
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.596 182759 INFO nova.virt.libvirt.driver [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Instance destroyed successfully.#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.597 182759 DEBUG nova.objects.instance [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 03f0da8f-c1d4-4432-bd08-77122a64e6b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:42:54 np0005591285 systemd[1]: var-lib-containers-storage-overlay-e0709a4967a78abefa8b3a8e63969120cd6473e502c16790afa3b9bd6b12c499-merged.mount: Deactivated successfully.
Jan 21 19:42:54 np0005591285 podman[243283]: 2026-01-22 00:42:54.603735765 +0000 UTC m=+0.084269615 container cleanup 3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:42:54 np0005591285 systemd[1]: libpod-conmon-3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404.scope: Deactivated successfully.
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.615 182759 DEBUG nova.virt.libvirt.vif [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:42:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-813069027',display_name='tempest-TestGettingAddress-server-813069027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-813069027',id=180,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfcFV8SVrYtqhuEJR2u0WnZRZ3aIYjdzcrjfLDmTvbNVw+iNWtleLlqtVUIYQyWXU5cujTqDWjA511UjJA6kRMyxPcbENHgTLoJy3T95U8C9/oslNz/OBwLaWEuXg2SRA==',key_name='tempest-TestGettingAddress-313470075',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:42:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jup1ca84',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:42:27Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=03f0da8f-c1d4-4432-bd08-77122a64e6b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.616 182759 DEBUG nova.network.os_vif_util [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.617 182759 DEBUG nova.network.os_vif_util [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.617 182759 DEBUG os_vif [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.619 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf0b1d8d-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.621 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.623 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.625 182759 INFO os_vif [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:2b:ed,bridge_name='br-int',has_traffic_filtering=True,id=df0b1d8d-bf36-48ca-b912-b8b71d623097,network=Network(0fbc923c-90ec-4c3d-92df-bc42843601b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0b1d8d-bf')#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.626 182759 INFO nova.virt.libvirt.driver [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Deleting instance files /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9_del#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.626 182759 INFO nova.virt.libvirt.driver [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Deletion of /var/lib/nova/instances/03f0da8f-c1d4-4432-bd08-77122a64e6b9_del complete#033[00m
Jan 21 19:42:54 np0005591285 podman[243327]: 2026-01-22 00:42:54.667933288 +0000 UTC m=+0.045642662 container remove 3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.674 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[cf88c84c-d31d-4213-9d28-38f8b0f4d568]: (4, ('Thu Jan 22 12:42:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 (3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404)\n3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404\nThu Jan 22 12:42:54 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 (3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404)\n3946a81ce75d4d50350bfb8f709f0f3a32cad47d13e5abe40ee755144a40b404\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.676 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7b579736-77b3-4eb2-b121-2d4b31115a13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.677 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fbc923c-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.679 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 kernel: tap0fbc923c-90: left promiscuous mode
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.682 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.684 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e15f9f-a0a6-4f72-a6c9-03fbcfaa4e33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.694 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.707 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c191e421-afe0-4ed3-a959-68c0be6e99e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.708 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5744123e-2616-4cfc-af6d-18d5a01bb30f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.723 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0624b990-107c-44ab-a18c-38b94cc269a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710529, 'reachable_time': 39054, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243341, 'error': None, 'target': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.724 182759 INFO nova.compute.manager [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.724 182759 DEBUG oslo.service.loopingcall [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.725 182759 DEBUG nova.compute.manager [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:42:54 np0005591285 nova_compute[182755]: 2026-01-22 00:42:54.725 182759 DEBUG nova.network.neutron [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:42:54 np0005591285 systemd[1]: run-netns-ovnmeta\x2d0fbc923c\x2d90ec\x2d4c3d\x2d92df\x2dbc42843601b3.mount: Deactivated successfully.
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.726 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:42:54 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:54.726 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[d6395db7-c69c-4278-b56c-cabfd7e4557c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:42:55 np0005591285 nova_compute[182755]: 2026-01-22 00:42:55.706 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.167 182759 DEBUG nova.network.neutron [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.189 182759 INFO nova.compute.manager [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Took 1.46 seconds to deallocate network for instance.#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.261 182759 DEBUG nova.compute.manager [req-a0e92939-3b20-48f4-ae0c-556fc06d1e0e req-293676a4-25a0-4904-98ad-68c7cd599aef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-vif-deleted-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.269 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.270 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.339 182759 DEBUG nova.compute.provider_tree [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.358 182759 DEBUG nova.compute.manager [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-vif-unplugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.359 182759 DEBUG oslo_concurrency.lockutils [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.359 182759 DEBUG oslo_concurrency.lockutils [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.359 182759 DEBUG oslo_concurrency.lockutils [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.360 182759 DEBUG nova.compute.manager [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] No waiting events found dispatching network-vif-unplugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.360 182759 WARNING nova.compute.manager [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received unexpected event network-vif-unplugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.360 182759 DEBUG nova.compute.manager [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.361 182759 DEBUG oslo_concurrency.lockutils [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.361 182759 DEBUG oslo_concurrency.lockutils [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.361 182759 DEBUG oslo_concurrency.lockutils [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.362 182759 DEBUG nova.compute.manager [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] No waiting events found dispatching network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.362 182759 WARNING nova.compute.manager [req-b6b0a5fa-f576-47aa-a1e6-60230bb93ff5 req-e8e2242b-237f-41b3-ae69-c5afa22987a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Received unexpected event network-vif-plugged-df0b1d8d-bf36-48ca-b912-b8b71d623097 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.364 182759 DEBUG nova.scheduler.client.report [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.383 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.413 182759 INFO nova.scheduler.client.report [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 03f0da8f-c1d4-4432-bd08-77122a64e6b9#033[00m
Jan 21 19:42:56 np0005591285 nova_compute[182755]: 2026-01-22 00:42:56.503 182759 DEBUG oslo_concurrency.lockutils [None req-54531ba5-8b55-410e-af67-98973564244d a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "03f0da8f-c1d4-4432-bd08-77122a64e6b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:42:57 np0005591285 podman[243342]: 2026-01-22 00:42:57.212292031 +0000 UTC m=+0.071631434 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:42:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:57.946 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:42:58 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:42:57.946 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:42:58 np0005591285 nova_compute[182755]: 2026-01-22 00:42:58.006 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:42:58 np0005591285 nova_compute[182755]: 2026-01-22 00:42:58.056 182759 DEBUG nova.network.neutron [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updated VIF entry in instance network info cache for port df0b1d8d-bf36-48ca-b912-b8b71d623097. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:42:58 np0005591285 nova_compute[182755]: 2026-01-22 00:42:58.056 182759 DEBUG nova.network.neutron [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Updating instance_info_cache with network_info: [{"id": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "address": "fa:16:3e:3e:2b:ed", "network": {"id": "0fbc923c-90ec-4c3d-92df-bc42843601b3", "bridge": "br-int", "label": "tempest-network-smoke--540170543", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:2bed", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0b1d8d-bf", "ovs_interfaceid": "df0b1d8d-bf36-48ca-b912-b8b71d623097", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:42:58 np0005591285 nova_compute[182755]: 2026-01-22 00:42:58.076 182759 DEBUG oslo_concurrency.lockutils [req-433b1973-dfec-4950-972e-e90926e12072 req-ece0b81a-79fc-4c0c-a581-f45c1fec3276 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-03f0da8f-c1d4-4432-bd08-77122a64e6b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:42:59 np0005591285 nova_compute[182755]: 2026-01-22 00:42:59.623 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:00 np0005591285 podman[243367]: 2026-01-22 00:43:00.190182234 +0000 UTC m=+0.057861452 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:43:00 np0005591285 podman[243366]: 2026-01-22 00:43:00.199313881 +0000 UTC m=+0.072236441 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 19:43:00 np0005591285 nova_compute[182755]: 2026-01-22 00:43:00.747 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:03.011 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:03.012 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:03.012 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:03 np0005591285 podman[243408]: 2026-01-22 00:43:03.202694842 +0000 UTC m=+0.077277797 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:43:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:03.949 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:04 np0005591285 nova_compute[182755]: 2026-01-22 00:43:04.627 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:05 np0005591285 nova_compute[182755]: 2026-01-22 00:43:05.749 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:07 np0005591285 nova_compute[182755]: 2026-01-22 00:43:07.036 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:07 np0005591285 nova_compute[182755]: 2026-01-22 00:43:07.142 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.241 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.595 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042574.593546, 03f0da8f-c1d4-4432-bd08-77122a64e6b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.595 182759 INFO nova.compute.manager [-] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.626 182759 DEBUG nova.compute.manager [None req-7f1c01f4-9963-4452-9819-cc175b26f3fa - - - - - -] [instance: 03f0da8f-c1d4-4432-bd08-77122a64e6b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:43:09 np0005591285 nova_compute[182755]: 2026-01-22 00:43:09.631 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:10 np0005591285 nova_compute[182755]: 2026-01-22 00:43:10.791 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:11 np0005591285 nova_compute[182755]: 2026-01-22 00:43:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:13 np0005591285 nova_compute[182755]: 2026-01-22 00:43:13.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:13 np0005591285 nova_compute[182755]: 2026-01-22 00:43:13.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:43:13 np0005591285 nova_compute[182755]: 2026-01-22 00:43:13.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:43:14 np0005591285 nova_compute[182755]: 2026-01-22 00:43:14.233 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:14 np0005591285 nova_compute[182755]: 2026-01-22 00:43:14.234 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:14 np0005591285 nova_compute[182755]: 2026-01-22 00:43:14.234 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:14 np0005591285 nova_compute[182755]: 2026-01-22 00:43:14.235 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:43:14 np0005591285 nova_compute[182755]: 2026-01-22 00:43:14.635 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:15 np0005591285 nova_compute[182755]: 2026-01-22 00:43:15.845 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:16 np0005591285 nova_compute[182755]: 2026-01-22 00:43:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:19 np0005591285 podman[243436]: 2026-01-22 00:43:19.20757821 +0000 UTC m=+0.067173444 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 21 19:43:19 np0005591285 podman[243437]: 2026-01-22 00:43:19.220181309 +0000 UTC m=+0.082896497 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 21 19:43:19 np0005591285 nova_compute[182755]: 2026-01-22 00:43:19.637 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:20 np0005591285 nova_compute[182755]: 2026-01-22 00:43:20.847 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:22 np0005591285 nova_compute[182755]: 2026-01-22 00:43:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.247 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.247 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.459 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.461 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5740MB free_disk=73.17712020874023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.461 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.461 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.537 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.538 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.560 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.575 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.598 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.598 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:24 np0005591285 nova_compute[182755]: 2026-01-22 00:43:24.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:25 np0005591285 nova_compute[182755]: 2026-01-22 00:43:25.847 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:26.451 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8::f816:3eff:fee0:9fdc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e429e99d-d544-4554-bbe2-f8538fbd55b8) old=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:43:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:26.454 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e429e99d-d544-4554-bbe2-f8538fbd55b8 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 updated#033[00m
Jan 21 19:43:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:26.456 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:43:26 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:26.458 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[02be3808-98fc-4f7b-a2bf-b60ff7f78dfc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:28 np0005591285 podman[243480]: 2026-01-22 00:43:28.210087487 +0000 UTC m=+0.076313501 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:43:29 np0005591285 nova_compute[182755]: 2026-01-22 00:43:29.643 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:30.057 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8:0:1:f816:3eff:fee0:9fdc 2001:db8::f816:3eff:fee0:9fdc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fee0:9fdc/64 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e429e99d-d544-4554-bbe2-f8538fbd55b8) old=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8::f816:3eff:fee0:9fdc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:43:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:30.058 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e429e99d-d544-4554-bbe2-f8538fbd55b8 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 updated#033[00m
Jan 21 19:43:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:30.059 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:43:30 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:30.060 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce2dd73-6bee-4400-9f16-95d8693da797]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:30 np0005591285 nova_compute[182755]: 2026-01-22 00:43:30.850 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:31 np0005591285 podman[243505]: 2026-01-22 00:43:31.189383238 +0000 UTC m=+0.054068709 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:43:31 np0005591285 podman[243506]: 2026-01-22 00:43:31.193141149 +0000 UTC m=+0.052606809 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:43:34 np0005591285 podman[243548]: 2026-01-22 00:43:34.273677333 +0000 UTC m=+0.146584287 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base 
Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:43:34 np0005591285 nova_compute[182755]: 2026-01-22 00:43:34.646 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:35 np0005591285 nova_compute[182755]: 2026-01-22 00:43:35.852 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:35 np0005591285 nova_compute[182755]: 2026-01-22 00:43:35.904 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:35 np0005591285 nova_compute[182755]: 2026-01-22 00:43:35.904 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:35 np0005591285 nova_compute[182755]: 2026-01-22 00:43:35.925 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.023 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.024 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.034 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.035 182759 INFO nova.compute.claims [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.158 182759 DEBUG nova.compute.provider_tree [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.176 182759 DEBUG nova.scheduler.client.report [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.202 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.203 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.280 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.281 182759 DEBUG nova.network.neutron [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.300 182759 INFO nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.323 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.490 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.493 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.493 182759 INFO nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Creating image(s)#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.494 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.495 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.496 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.517 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.584 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.585 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.586 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.598 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.651 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:43:36 np0005591285 nova_compute[182755]: 2026-01-22 00:43:36.652 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:43:37 np0005591285 nova_compute[182755]: 2026-01-22 00:43:37.297 182759 DEBUG nova.policy [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:43:37 np0005591285 nova_compute[182755]: 2026-01-22 00:43:37.991 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk 1073741824" returned: 0 in 1.339s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:43:37 np0005591285 nova_compute[182755]: 2026-01-22 00:43:37.993 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:37 np0005591285 nova_compute[182755]: 2026-01-22 00:43:37.993 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.045 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.047 182759 DEBUG nova.virt.disk.api [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.048 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:43:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:38.090 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.090 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:38 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:38.093 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.113 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.115 182759 DEBUG nova.virt.disk.api [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.115 182759 DEBUG nova.objects.instance [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid e1d6bfab-5b5f-4e87-903f-663a797f6e97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.132 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.133 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Ensure instance console log exists: /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.133 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.133 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.133 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.190 182759 DEBUG nova.network.neutron [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Successfully created port: 3a45d6cb-8d53-4e0f-8011-06cad53a8190 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.893 182759 DEBUG nova.network.neutron [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Successfully updated port: 3a45d6cb-8d53-4e0f-8011-06cad53a8190 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.911 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.911 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:43:38 np0005591285 nova_compute[182755]: 2026-01-22 00:43:38.911 182759 DEBUG nova.network.neutron [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:43:39 np0005591285 nova_compute[182755]: 2026-01-22 00:43:39.073 182759 DEBUG nova.network.neutron [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 19:43:39 np0005591285 nova_compute[182755]: 2026-01-22 00:43:39.320 182759 DEBUG nova.compute.manager [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-changed-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:43:39 np0005591285 nova_compute[182755]: 2026-01-22 00:43:39.320 182759 DEBUG nova.compute.manager [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Refreshing instance network info cache due to event network-changed-3a45d6cb-8d53-4e0f-8011-06cad53a8190. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:43:39 np0005591285 nova_compute[182755]: 2026-01-22 00:43:39.320 182759 DEBUG oslo_concurrency.lockutils [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:43:39 np0005591285 nova_compute[182755]: 2026-01-22 00:43:39.649 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:43:40 np0005591285 nova_compute[182755]: 2026-01-22 00:43:40.904 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.534 182759 DEBUG nova.network.neutron [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updating instance_info_cache with network_info: [{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.560 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.560 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Instance network_info: |[{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.560 182759 DEBUG oslo_concurrency.lockutils [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.561 182759 DEBUG nova.network.neutron [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Refreshing network info cache for port 3a45d6cb-8d53-4e0f-8011-06cad53a8190 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.564 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Start _get_guest_xml network_info=[{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.568 182759 WARNING nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.574 182759 DEBUG nova.virt.libvirt.host [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.574 182759 DEBUG nova.virt.libvirt.host [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.579 182759 DEBUG nova.virt.libvirt.host [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.579 182759 DEBUG nova.virt.libvirt.host [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.580 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.581 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.581 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.581 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.581 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.581 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.582 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.582 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.582 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.582 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.582 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.583 182759 DEBUG nova.virt.hardware [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.586 182759 DEBUG nova.virt.libvirt.vif [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1515318656',display_name='tempest-TestGettingAddress-server-1515318656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1515318656',id=181,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzS1Jx/APWM5vcQRL+j1JWQJn1AI5LoKxMBW97Fa2XcQxLO8wlk0d2rFNEjm5ruItcAVjf35MpAgTKTp3E/600O3yHmKIiUXb2hz3moFXrY6FueGiSaiI56sxuqhWZG5g==',key_name='tempest-TestGettingAddress-79214675',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jgzj67i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:43:36Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e1d6bfab-5b5f-4e87-903f-663a797f6e97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.586 182759 DEBUG nova.network.os_vif_util [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.587 182759 DEBUG nova.network.os_vif_util [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.588 182759 DEBUG nova.objects.instance [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid e1d6bfab-5b5f-4e87-903f-663a797f6e97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.602 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <uuid>e1d6bfab-5b5f-4e87-903f-663a797f6e97</uuid>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <name>instance-000000b5</name>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestGettingAddress-server-1515318656</nova:name>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:43:41</nova:creationTime>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        <nova:port uuid="3a45d6cb-8d53-4e0f-8011-06cad53a8190">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:feff:5e29" ipVersion="6"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:feff:5e29" ipVersion="6"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <entry name="serial">e1d6bfab-5b5f-4e87-903f-663a797f6e97</entry>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <entry name="uuid">e1d6bfab-5b5f-4e87-903f-663a797f6e97</entry>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.config"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:ff:5e:29"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <target dev="tap3a45d6cb-8d"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/console.log" append="off"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:43:41 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:43:41 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:43:41 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:43:41 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.604 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Preparing to wait for external event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.604 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.605 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.605 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.606 182759 DEBUG nova.virt.libvirt.vif [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1515318656',display_name='tempest-TestGettingAddress-server-1515318656',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1515318656',id=181,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzS1Jx/APWM5vcQRL+j1JWQJn1AI5LoKxMBW97Fa2XcQxLO8wlk0d2rFNEjm5ruItcAVjf35MpAgTKTp3E/600O3yHmKIiUXb2hz3moFXrY6FueGiSaiI56sxuqhWZG5g==',key_name='tempest-TestGettingAddress-79214675',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jgzj67i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:43:36Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e1d6bfab-5b5f-4e87-903f-663a797f6e97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.606 182759 DEBUG nova.network.os_vif_util [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.607 182759 DEBUG nova.network.os_vif_util [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.607 182759 DEBUG os_vif [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.607 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.608 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.608 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.614 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a45d6cb-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.614 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a45d6cb-8d, col_values=(('external_ids', {'iface-id': '3a45d6cb-8d53-4e0f-8011-06cad53a8190', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:5e:29', 'vm-uuid': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:41 np0005591285 NetworkManager[55017]: <info>  [1769042621.6167] manager: (tap3a45d6cb-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.617 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.622 182759 INFO os_vif [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d')#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.686 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.687 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.687 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:ff:5e:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:43:41 np0005591285 nova_compute[182755]: 2026-01-22 00:43:41.688 182759 INFO nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Using config drive#033[00m
Jan 21 19:43:42 np0005591285 nova_compute[182755]: 2026-01-22 00:43:42.477 182759 INFO nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Creating config drive at /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.config#033[00m
Jan 21 19:43:42 np0005591285 nova_compute[182755]: 2026-01-22 00:43:42.487 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcl8as3vo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:43:42 np0005591285 nova_compute[182755]: 2026-01-22 00:43:42.616 182759 DEBUG oslo_concurrency.processutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcl8as3vo" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:43:42 np0005591285 kernel: tap3a45d6cb-8d: entered promiscuous mode
Jan 21 19:43:42 np0005591285 NetworkManager[55017]: <info>  [1769042622.6784] manager: (tap3a45d6cb-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/335)
Jan 21 19:43:42 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:42Z|00678|binding|INFO|Claiming lport 3a45d6cb-8d53-4e0f-8011-06cad53a8190 for this chassis.
Jan 21 19:43:42 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:42Z|00679|binding|INFO|3a45d6cb-8d53-4e0f-8011-06cad53a8190: Claiming fa:16:3e:ff:5e:29 10.100.0.7 2001:db8:0:1:f816:3eff:feff:5e29 2001:db8::f816:3eff:feff:5e29
Jan 21 19:43:42 np0005591285 nova_compute[182755]: 2026-01-22 00:43:42.680 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:42 np0005591285 nova_compute[182755]: 2026-01-22 00:43:42.689 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.696 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:5e:29 10.100.0.7 2001:db8:0:1:f816:3eff:feff:5e29 2001:db8::f816:3eff:feff:5e29'], port_security=['fa:16:3e:ff:5e:29 10.100.0.7 2001:db8:0:1:f816:3eff:feff:5e29 2001:db8::f816:3eff:feff:5e29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8:0:1:f816:3eff:feff:5e29/64 2001:db8::f816:3eff:feff:5e29/64', 'neutron:device_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '46ea8a3f-4945-4bb2-97cf-c1bd6e8fe825', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=3a45d6cb-8d53-4e0f-8011-06cad53a8190) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.697 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 3a45d6cb-8d53-4e0f-8011-06cad53a8190 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 bound to our chassis#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.698 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bc173f9b-a39e-490e-b1d4-92abd1855016#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.710 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bc2200-f494-42be-b24b-925b7b1ca805]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.711 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbc173f9b-a1 in ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:43:42 np0005591285 systemd-udevd[243612]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.712 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbc173f9b-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.713 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e3305281-d73d-4278-b00a-2d40d4dccafd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.714 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a3186086-9029-460f-bd8e-2c3c3ca7d2de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 systemd-machined[154022]: New machine qemu-77-instance-000000b5.
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.726 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[83ced2c3-5360-4cd1-91e9-510f7201f0c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 NetworkManager[55017]: <info>  [1769042622.7290] device (tap3a45d6cb-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:43:42 np0005591285 NetworkManager[55017]: <info>  [1769042622.7304] device (tap3a45d6cb-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:43:42 np0005591285 systemd[1]: Started Virtual Machine qemu-77-instance-000000b5.
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.767 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d08ae929-cbd2-4dd6-b6dd-5efae6f43306]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:42Z|00680|binding|INFO|Setting lport 3a45d6cb-8d53-4e0f-8011-06cad53a8190 ovn-installed in OVS
Jan 21 19:43:42 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:42Z|00681|binding|INFO|Setting lport 3a45d6cb-8d53-4e0f-8011-06cad53a8190 up in Southbound
Jan 21 19:43:42 np0005591285 nova_compute[182755]: 2026-01-22 00:43:42.777 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.797 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8a822977-31ef-4139-8fd2-17d8ff833088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 NetworkManager[55017]: <info>  [1769042622.8031] manager: (tapbc173f9b-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/336)
Jan 21 19:43:42 np0005591285 systemd-udevd[243616]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.802 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5791b86a-7644-4f56-9c72-4e681356bc9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.828 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9566d055-668b-4133-828c-ded84a16248e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.831 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed1be65-86f7-46d7-8a9f-a42cea6ee83f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 NetworkManager[55017]: <info>  [1769042622.8500] device (tapbc173f9b-a0): carrier: link connected
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.856 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[8036737d-c186-4ff6-8c5d-3f670eadaf79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.873 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[751a75fb-5b5c-47c3-b0ec-09a4e6f29cc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc173f9b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:9f:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718150, 'reachable_time': 25144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243645, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.887 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba802bbf-e75d-4c0d-811a-3d9ee34f0aff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:9fdc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 718150, 'tstamp': 718150}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243646, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.909 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[59a7f4bc-4de9-486c-aa5b-4ffb3cd25f4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbc173f9b-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:9f:dc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718150, 'reachable_time': 25144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243647, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:42 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:42.938 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb3e162-95e7-4799-9a43-30ffcadc919c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.016 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ed74c202-d894-4767-a4b1-553310931b24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.022 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc173f9b-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.023 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.023 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc173f9b-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:43 np0005591285 kernel: tapbc173f9b-a0: entered promiscuous mode
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.026 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:43 np0005591285 NetworkManager[55017]: <info>  [1769042623.0280] manager: (tapbc173f9b-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.031 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.036 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbc173f9b-a0, col_values=(('external_ids', {'iface-id': 'e429e99d-d544-4554-bbe2-f8538fbd55b8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:43 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:43Z|00682|binding|INFO|Releasing lport e429e99d-d544-4554-bbe2-f8538fbd55b8 from this chassis (sb_readonly=0)
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.038 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.039 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.043 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bc173f9b-a39e-490e-b1d4-92abd1855016.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bc173f9b-a39e-490e-b1d4-92abd1855016.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.044 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[148b0039-c8b6-4991-8919-256c09498db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.046 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-bc173f9b-a39e-490e-b1d4-92abd1855016
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/bc173f9b-a39e-490e-b1d4-92abd1855016.pid.haproxy
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID bc173f9b-a39e-490e-b1d4-92abd1855016
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:43:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:43.047 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'env', 'PROCESS_TAG=haproxy-bc173f9b-a39e-490e-b1d4-92abd1855016', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bc173f9b-a39e-490e-b1d4-92abd1855016.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.060 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.363 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042623.3633718, e1d6bfab-5b5f-4e87-903f-663a797f6e97 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.364 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] VM Started (Lifecycle Event)#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.392 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.396 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042623.3657928, e1d6bfab-5b5f-4e87-903f-663a797f6e97 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.396 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.418 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.421 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.445 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.457 182759 DEBUG nova.compute.manager [req-8d980619-973e-4138-b924-45a37181a571 req-bbfd9349-b7c2-4cbb-9d99-227f59b2340b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.458 182759 DEBUG oslo_concurrency.lockutils [req-8d980619-973e-4138-b924-45a37181a571 req-bbfd9349-b7c2-4cbb-9d99-227f59b2340b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.458 182759 DEBUG oslo_concurrency.lockutils [req-8d980619-973e-4138-b924-45a37181a571 req-bbfd9349-b7c2-4cbb-9d99-227f59b2340b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.458 182759 DEBUG oslo_concurrency.lockutils [req-8d980619-973e-4138-b924-45a37181a571 req-bbfd9349-b7c2-4cbb-9d99-227f59b2340b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.458 182759 DEBUG nova.compute.manager [req-8d980619-973e-4138-b924-45a37181a571 req-bbfd9349-b7c2-4cbb-9d99-227f59b2340b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Processing event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.459 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:43:43 np0005591285 podman[243683]: 2026-01-22 00:43:43.461118582 +0000 UTC m=+0.051699757 container create bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.463 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.464 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042623.4635222, e1d6bfab-5b5f-4e87-903f-663a797f6e97 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.464 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.468 182759 INFO nova.virt.libvirt.driver [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Instance spawned successfully.#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.469 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.493 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.498 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:43:43 np0005591285 systemd[1]: Started libpod-conmon-bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb.scope.
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.502 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.503 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.503 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.504 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.504 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.505 182759 DEBUG nova.virt.libvirt.driver [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:43:43 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:43:43 np0005591285 podman[243683]: 2026-01-22 00:43:43.432203081 +0000 UTC m=+0.022784306 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:43:43 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16f7b1317b5723a6051563eb45f5150513e3bc15c2b8e4b0424b96705087c381/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.532 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:43:43 np0005591285 podman[243683]: 2026-01-22 00:43:43.542720254 +0000 UTC m=+0.133301449 container init bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:43:43 np0005591285 podman[243683]: 2026-01-22 00:43:43.547606855 +0000 UTC m=+0.138188030 container start bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 19:43:43 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [NOTICE]   (243703) : New worker (243705) forked
Jan 21 19:43:43 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [NOTICE]   (243703) : Loading success.
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.587 182759 INFO nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Took 7.10 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.588 182759 DEBUG nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.693 182759 INFO nova.compute.manager [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Took 7.71 seconds to build instance.#033[00m
Jan 21 19:43:43 np0005591285 nova_compute[182755]: 2026-01-22 00:43:43.726 182759 DEBUG oslo_concurrency.lockutils [None req-fa2a4568-fb3f-44a5-acea-42cf3c0af941 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.822s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:44 np0005591285 nova_compute[182755]: 2026-01-22 00:43:44.455 182759 DEBUG nova.network.neutron [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updated VIF entry in instance network info cache for port 3a45d6cb-8d53-4e0f-8011-06cad53a8190. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:43:44 np0005591285 nova_compute[182755]: 2026-01-22 00:43:44.456 182759 DEBUG nova.network.neutron [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updating instance_info_cache with network_info: [{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:43:44 np0005591285 nova_compute[182755]: 2026-01-22 00:43:44.474 182759 DEBUG oslo_concurrency.lockutils [req-50fe9731-5c58-42f8-a782-a74473d48955 req-78a69826-8d84-480e-a647-f8f14682e2a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.553 182759 DEBUG nova.compute.manager [req-bf1ed2fa-3861-448b-82f6-b47a4fdb0def req-94236128-e512-47ec-80bc-97acca130b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.554 182759 DEBUG oslo_concurrency.lockutils [req-bf1ed2fa-3861-448b-82f6-b47a4fdb0def req-94236128-e512-47ec-80bc-97acca130b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.555 182759 DEBUG oslo_concurrency.lockutils [req-bf1ed2fa-3861-448b-82f6-b47a4fdb0def req-94236128-e512-47ec-80bc-97acca130b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.555 182759 DEBUG oslo_concurrency.lockutils [req-bf1ed2fa-3861-448b-82f6-b47a4fdb0def req-94236128-e512-47ec-80bc-97acca130b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.555 182759 DEBUG nova.compute.manager [req-bf1ed2fa-3861-448b-82f6-b47a4fdb0def req-94236128-e512-47ec-80bc-97acca130b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] No waiting events found dispatching network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.555 182759 WARNING nova.compute.manager [req-bf1ed2fa-3861-448b-82f6-b47a4fdb0def req-94236128-e512-47ec-80bc-97acca130b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received unexpected event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:43:45 np0005591285 nova_compute[182755]: 2026-01-22 00:43:45.906 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:46 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:43:46.096 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:43:46 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:46Z|00683|binding|INFO|Releasing lport e429e99d-d544-4554-bbe2-f8538fbd55b8 from this chassis (sb_readonly=0)
Jan 21 19:43:46 np0005591285 NetworkManager[55017]: <info>  [1769042626.4903] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Jan 21 19:43:46 np0005591285 nova_compute[182755]: 2026-01-22 00:43:46.490 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:46 np0005591285 NetworkManager[55017]: <info>  [1769042626.4912] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Jan 21 19:43:46 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:46Z|00684|binding|INFO|Releasing lport e429e99d-d544-4554-bbe2-f8538fbd55b8 from this chassis (sb_readonly=0)
Jan 21 19:43:46 np0005591285 nova_compute[182755]: 2026-01-22 00:43:46.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:46 np0005591285 nova_compute[182755]: 2026-01-22 00:43:46.523 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:46 np0005591285 nova_compute[182755]: 2026-01-22 00:43:46.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:47 np0005591285 nova_compute[182755]: 2026-01-22 00:43:47.128 182759 DEBUG nova.compute.manager [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-changed-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:43:47 np0005591285 nova_compute[182755]: 2026-01-22 00:43:47.128 182759 DEBUG nova.compute.manager [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Refreshing instance network info cache due to event network-changed-3a45d6cb-8d53-4e0f-8011-06cad53a8190. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:43:47 np0005591285 nova_compute[182755]: 2026-01-22 00:43:47.129 182759 DEBUG oslo_concurrency.lockutils [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:43:47 np0005591285 nova_compute[182755]: 2026-01-22 00:43:47.129 182759 DEBUG oslo_concurrency.lockutils [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:43:47 np0005591285 nova_compute[182755]: 2026-01-22 00:43:47.130 182759 DEBUG nova.network.neutron [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Refreshing network info cache for port 3a45d6cb-8d53-4e0f-8011-06cad53a8190 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:43:48 np0005591285 nova_compute[182755]: 2026-01-22 00:43:48.856 182759 DEBUG nova.network.neutron [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updated VIF entry in instance network info cache for port 3a45d6cb-8d53-4e0f-8011-06cad53a8190. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:43:48 np0005591285 nova_compute[182755]: 2026-01-22 00:43:48.858 182759 DEBUG nova.network.neutron [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updating instance_info_cache with network_info: [{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:43:48 np0005591285 nova_compute[182755]: 2026-01-22 00:43:48.883 182759 DEBUG oslo_concurrency.lockutils [req-0c35ea46-584b-4206-a072-10a81bb1bf52 req-c2acfba5-0b8f-48ac-84a0-76a8b96da2f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:43:50 np0005591285 podman[243716]: 2026-01-22 00:43:50.204306988 +0000 UTC m=+0.063781192 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:43:50 np0005591285 podman[243715]: 2026-01-22 00:43:50.233615309 +0000 UTC m=+0.088954782 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git)
Jan 21 19:43:50 np0005591285 nova_compute[182755]: 2026-01-22 00:43:50.908 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:51 np0005591285 nova_compute[182755]: 2026-01-22 00:43:51.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:53 np0005591285 nova_compute[182755]: 2026-01-22 00:43:53.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:43:53 np0005591285 nova_compute[182755]: 2026-01-22 00:43:53.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:43:55 np0005591285 nova_compute[182755]: 2026-01-22 00:43:55.910 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:56 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:56Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:5e:29 10.100.0.7
Jan 21 19:43:56 np0005591285 ovn_controller[94908]: 2026-01-22T00:43:56Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:5e:29 10.100.0.7
Jan 21 19:43:56 np0005591285 nova_compute[182755]: 2026-01-22 00:43:56.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:43:59 np0005591285 podman[243765]: 2026-01-22 00:43:59.20511125 +0000 UTC m=+0.061207113 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:44:00 np0005591285 nova_compute[182755]: 2026-01-22 00:44:00.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:01 np0005591285 nova_compute[182755]: 2026-01-22 00:44:01.624 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:02 np0005591285 podman[243790]: 2026-01-22 00:44:02.174827033 +0000 UTC m=+0.044201024 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:44:02 np0005591285 podman[243789]: 2026-01-22 00:44:02.174945686 +0000 UTC m=+0.049489337 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 21 19:44:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:03.013 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:03.013 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:03.014 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:03 np0005591285 nova_compute[182755]: 2026-01-22 00:44:03.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:05 np0005591285 podman[243831]: 2026-01-22 00:44:05.248757167 +0000 UTC m=+0.118553950 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:44:05 np0005591285 nova_compute[182755]: 2026-01-22 00:44:05.966 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:06 np0005591285 nova_compute[182755]: 2026-01-22 00:44:06.626 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.240 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.240 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.241 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.504 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.505 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.505 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:44:09 np0005591285 nova_compute[182755]: 2026-01-22 00:44:09.506 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e1d6bfab-5b5f-4e87-903f-663a797f6e97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:44:10 np0005591285 nova_compute[182755]: 2026-01-22 00:44:10.981 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:11 np0005591285 nova_compute[182755]: 2026-01-22 00:44:11.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:12 np0005591285 nova_compute[182755]: 2026-01-22 00:44:12.503 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updating instance_info_cache with network_info: [{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:44:12 np0005591285 nova_compute[182755]: 2026-01-22 00:44:12.524 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:44:12 np0005591285 nova_compute[182755]: 2026-01-22 00:44:12.525 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:44:12 np0005591285 nova_compute[182755]: 2026-01-22 00:44:12.526 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:14 np0005591285 nova_compute[182755]: 2026-01-22 00:44:14.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:14 np0005591285 nova_compute[182755]: 2026-01-22 00:44:14.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:44:15 np0005591285 nova_compute[182755]: 2026-01-22 00:44:15.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:15 np0005591285 nova_compute[182755]: 2026-01-22 00:44:15.984 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:16 np0005591285 nova_compute[182755]: 2026-01-22 00:44:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:16 np0005591285 nova_compute[182755]: 2026-01-22 00:44:16.630 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:17 np0005591285 nova_compute[182755]: 2026-01-22 00:44:17.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:18.087 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:44:18 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:18.088 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:44:18 np0005591285 nova_compute[182755]: 2026-01-22 00:44:18.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:21 np0005591285 nova_compute[182755]: 2026-01-22 00:44:21.023 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:21 np0005591285 podman[243860]: 2026-01-22 00:44:21.186740207 +0000 UTC m=+0.058892521 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:44:21 np0005591285 podman[243859]: 2026-01-22 00:44:21.202362329 +0000 UTC m=+0.077648997 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Jan 21 19:44:21 np0005591285 nova_compute[182755]: 2026-01-22 00:44:21.633 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.184 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'name': 'tempest-TestGettingAddress-server-1515318656', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b5', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.188 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e1d6bfab-5b5f-4e87-903f-663a797f6e97 / tap3a45d6cb-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.189 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c875330b-a621-4ede-a080-1e9c500f7e7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.185348', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ed2f724-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': '219c0026089b3088d612ea04bcc019b047e66f133a493bbb0e304e9d023ff0a8'}]}, 'timestamp': '2026-01-22 00:44:23.189883', '_unique_id': '38ba50c145cc49e68ab54713e6f22d29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.192 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.193 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.193 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>]
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.194 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79eaf4da-0a37-450f-8b38-f42c159724fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.194307', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ed3ba10-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': 'd4014f4e1691cd791546716efa730920beb88ec717471033df27c08f19a52b0e'}]}, 'timestamp': '2026-01-22 00:44:23.194603', '_unique_id': 'b5072fa9bc2543d5b1bd33e6b3131112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.195 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.226 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.226 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2dbb3c1-3f23-49c9-988d-d6de42e0b764', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.196367', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ed89d32-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': 'd74449316edb1a265625bd5a890541ab61147aad2dc9f359e763e58abdb58c5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.196367', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ed8a908-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '241a253136a8c0a6b8537e54474106c69ed531c2d8212f8f1621f9d1825d6a78'}]}, 'timestamp': '2026-01-22 00:44:23.226953', '_unique_id': 'c98d80649f1547568ef5a93b0b030840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.227 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.228 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.251 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/cpu volume: 11890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '951e49b6-4e3a-48f1-872d-67659c923d90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11890000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'timestamp': '2026-01-22T00:44:23.228843', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7edc75b0-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.970466823, 'message_signature': '9d2465523c5f0a1c19f115212afb6cbc0f42eba333cfc63d0e3b7599ffc7271f'}]}, 'timestamp': '2026-01-22 00:44:23.251971', '_unique_id': 'fe2f17a2690748c3b6b843cb997ff867'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.253 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.255 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bbd5ded-9482-48da-9dc9-e0691e12577c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.255391', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7edd1a9c-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': 'dabdbcddacaf4a740dc382e6f117ac9c34f14c9f6d36abcc97c1bfb11319be39'}]}, 'timestamp': '2026-01-22 00:44:23.256302', '_unique_id': '1db1ce1f55934fda85434e531432f1ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.258 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.260 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee732413-c59e-40d3-9d23-384cc30527b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.260662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7edde3d2-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': '65cb90811516f509554dcb2d0a196897d4c3a485a8acb4c98a045185e3f78e64'}]}, 'timestamp': '2026-01-22 00:44:23.261370', '_unique_id': '8769729826d24093a7599bd670557212'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.262 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.264 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.278 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.278 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cf501a9-effa-46a7-9660-a661a8c1565c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.265136', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee08632-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.984410339, 'message_signature': 'f2a912bae74a98f4746eafb7724144f470243a923ac10156642a111b3fc7bd56'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.265136', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee092e4-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.984410339, 'message_signature': 'c23cf695130ba63e44e0e828ed1fde41ab1e3a8cebd6f4a736004ae7b04e121e'}]}, 'timestamp': '2026-01-22 00:44:23.278778', '_unique_id': 'bbfc96be59014d69a73328313ce2e5b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.279 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.280 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efbe03f4-5b2a-4a6a-abf3-38ecc8612cd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.280698', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ee0eaf0-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': 'a554534abae13d8edb84d3476cd618c7efd91bcae77867f514bed39648fc5078'}]}, 'timestamp': '2026-01-22 00:44:23.281035', '_unique_id': 'bed5545e98ec44aabc0e04b1a51d68c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.282 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.incoming.bytes volume: 4271 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a45829fe-b920-483c-84c4-974b9cf0712b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4271, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.282477', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ee12e84-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': 'd01255c01bb8af06bc095f7e3045ba1d7737589e689ddeb08d412c83622430ec'}]}, 'timestamp': '2026-01-22 00:44:23.282761', '_unique_id': 'b496919ee0984fd0a2fed504c032192b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.284 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.284 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>]
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.284 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/memory.usage volume: 42.75 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58102127-6cc5-48df-bbb5-5dab78476830', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.75, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'timestamp': '2026-01-22T00:44:23.284672', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '7ee1841a-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.970466823, 'message_signature': '1813ed7a0d8fa3e2337dc2f4ab57cb26cf1588fdfd28d74ce5b46c62c71855ea'}]}, 'timestamp': '2026-01-22 00:44:23.284962', '_unique_id': '0b02da61c319475ca366f70a90859d33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.286 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.write.bytes volume: 72929280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.286 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03b58b2c-4b11-415b-a05e-ebe6f1fd4113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72929280, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.286411', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee1c7fe-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '2ceaa0325b9765db86df6b7376ec7e99c23828ffe08125dda4af59e20029fbe3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.286411', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee1d186-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '94ac0642decee50e7667d3ed25a17dea11efcd8a729bdc79f999fcdcbb726956'}]}, 'timestamp': '2026-01-22 00:44:23.286955', '_unique_id': '098720ce4bbd46faa150271613f13bcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.287 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.288 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.outgoing.bytes volume: 4048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3d0b500-ab0c-4a1c-a29e-9a58e3842cb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4048, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.288436', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ee21704-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': 'b9322d91366915edc08cb75259463ee2d511813d5bc7fbfebac0338c1ab1fa6c'}]}, 'timestamp': '2026-01-22 00:44:23.288712', '_unique_id': '5e85d66de63c4bf7b3aecfcb829a22c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.289 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53cd49fb-b377-42af-91c0-ceb41e77a258', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.290157', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ee25a02-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': '6ce9019a9665038e1e61767b1916b676c62349182a3e5a0c9c4b49004ce0b584'}]}, 'timestamp': '2026-01-22 00:44:23.290426', '_unique_id': '8889aa7553de45ff8fa22e0e0d47613a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.290 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.291 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.291 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.292 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19cf5c20-7a7c-4877-a59a-c72085802317', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.291959', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee2a07a-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '76451f63dd6a0528549c5927b510e6ae588531e6092bd30ccd21c74fadd6c79e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.291959', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee2aebc-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '2058d59e57cc2d1d1f48bbc864c7347fe88a8444735bb06abbaa3ade31ba6bd9'}]}, 'timestamp': '2026-01-22 00:44:23.292584', '_unique_id': 'ef64feee048c4264a2e9c51cd3b35076'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.293 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.294 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.294 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>]
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.294 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.294 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63403ad5-94a6-49ad-a96e-f7d64895dfec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.294513', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee304a2-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.984410339, 'message_signature': 'd20982ace6bf03c3ff8624776e11693dcb869186284b82046ef2b936536c4395'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.294513', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee30f92-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.984410339, 'message_signature': '75cc1099f7c1424ee8c5c78b80a522b124feb612d3ce5a4993e142ea2d09441f'}]}, 'timestamp': '2026-01-22 00:44:23.295063', '_unique_id': '0dddeceded4b4d42a5fe03f57005c8fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.295 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.296 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c71144a-5b6d-4f84-a411-8c1404fb59fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.296491', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ee35178-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': '1fd49dae7f46b390e04f9677717d7f2d2e9286fc6f697f8025f3811c5a1d2210'}]}, 'timestamp': '2026-01-22 00:44:23.296768', '_unique_id': 'e6438de3c2a449f580eba5fb053e65a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.297 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.298 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.298 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56229970-4dfa-43af-8915-2171663fec63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.298398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee39c82-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.984410339, 'message_signature': 'f7faa51c0ab0ab7aa09077e2b88b232ad1b7525e78bb99629af7cfb869dda626'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.298398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee3a664-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.984410339, 'message_signature': '70eb4fef5a5cc54329eeac924588ffbbde0b9be142a953c3326c4c965697bae4'}]}, 'timestamp': '2026-01-22 00:44:23.298949', '_unique_id': 'dcaec961fcbb45b0bc3858fec70c1d11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.299 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.300 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.300 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2aec30a-6b57-480c-bed1-8feeaa2051b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.300448', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee3ebe2-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '6463e3b6a4cdfffd18edacb457692289aba6f2ad6ef8b829d8d17220ed0e232f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.300448', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee3f5ba-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': 'f27e15abab932e245e7fef05be291ec8c5bf18423eef6bf7300e0831ea297484'}]}, 'timestamp': '2026-01-22 00:44:23.300977', '_unique_id': 'c6c3454db1d347c7bd71f52da8dd30e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.301 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.302 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.302 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c39860d-6201-42ac-9af5-7c900bd443a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b5-e1d6bfab-5b5f-4e87-903f-663a797f6e97-tap3a45d6cb-8d', 'timestamp': '2026-01-22T00:44:23.302582', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'tap3a45d6cb-8d', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:5e:29', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a45d6cb-8d'}, 'message_id': '7ee43fac-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.904584765, 'message_signature': '7c41b85b8dafc083afd28cc1098a2c312c4dfdef96007d29d888cd57bed3f8a9'}]}, 'timestamp': '2026-01-22 00:44:23.302919', '_unique_id': 'ce85a0afa2a24e03a41343f90951634c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.303 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.304 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.read.latency volume: 151913924 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.304 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.read.latency volume: 25719840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '989e1fb3-8474-4917-badf-d67620e36e0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 151913924, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.304505', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee48c1e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': 'adcbfb31e0f79f962f523e76f6e93296304e9a4ece34448698cc8067a3195a1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25719840, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.304505', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee4995c-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '466e1b0d3c51677afaa0215e207a671dba64cc632f33f575cfb14f29b22fbac9'}]}, 'timestamp': '2026-01-22 00:44:23.305176', '_unique_id': '550a2ff2d47f4af3aa5cd038ad30407d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.305 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.306 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.307 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1515318656>]
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.307 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.write.latency volume: 2837906585 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.307 12 DEBUG ceilometer.compute.pollsters [-] e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffb9473a-7b3f-46fc-8d18-629286eed788', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2837906585, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-vda', 'timestamp': '2026-01-22T00:44:23.307281', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7ee4f6cc-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '81652d40bc32f96298ddce6ac93a9e083526af2060879738822165b7a3b74417'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97-sda', 'timestamp': '2026-01-22T00:44:23.307281', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1515318656', 'name': 'instance-000000b5', 'instance_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7ee50022-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7221.915557672, 'message_signature': '6effb22a7a280f4970006e8ff8403900f3b71d18d8e81ddc9dad6b581672ee04'}]}, 'timestamp': '2026-01-22 00:44:23.307774', '_unique_id': '11dee3932c9f499db2bb5d08e9411d5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:44:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:44:23.308 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:44:24 np0005591285 nova_compute[182755]: 2026-01-22 00:44:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:25.092 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.515 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.516 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.516 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.516 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.617 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.675 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.676 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.727 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.927 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.929 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5560MB free_disk=73.1482925415039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.930 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:25 np0005591285 nova_compute[182755]: 2026-01-22 00:44:25.930 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.028 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.091 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance e1d6bfab-5b5f-4e87-903f-663a797f6e97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.092 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.092 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.163 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.207 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.238 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.239 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:26 np0005591285 nova_compute[182755]: 2026-01-22 00:44:26.636 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:27 np0005591285 nova_compute[182755]: 2026-01-22 00:44:27.240 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:30 np0005591285 podman[243905]: 2026-01-22 00:44:30.20051588 +0000 UTC m=+0.062184139 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:44:31 np0005591285 nova_compute[182755]: 2026-01-22 00:44:31.064 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:31 np0005591285 nova_compute[182755]: 2026-01-22 00:44:31.638 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:33 np0005591285 podman[243929]: 2026-01-22 00:44:33.196840892 +0000 UTC m=+0.065192961 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 19:44:33 np0005591285 podman[243930]: 2026-01-22 00:44:33.225861065 +0000 UTC m=+0.086121605 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:44:36 np0005591285 nova_compute[182755]: 2026-01-22 00:44:36.066 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:36 np0005591285 podman[243970]: 2026-01-22 00:44:36.245842925 +0000 UTC m=+0.118560471 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller)
Jan 21 19:44:36 np0005591285 nova_compute[182755]: 2026-01-22 00:44:36.640 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:37 np0005591285 nova_compute[182755]: 2026-01-22 00:44:37.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:44:41 np0005591285 nova_compute[182755]: 2026-01-22 00:44:41.069 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:41 np0005591285 nova_compute[182755]: 2026-01-22 00:44:41.641 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.050 182759 DEBUG nova.compute.manager [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-changed-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.050 182759 DEBUG nova.compute.manager [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Refreshing instance network info cache due to event network-changed-3a45d6cb-8d53-4e0f-8011-06cad53a8190. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.051 182759 DEBUG oslo_concurrency.lockutils [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.051 182759 DEBUG oslo_concurrency.lockutils [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.051 182759 DEBUG nova.network.neutron [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Refreshing network info cache for port 3a45d6cb-8d53-4e0f-8011-06cad53a8190 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.210 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.210 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.210 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.211 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.211 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.222 182759 INFO nova.compute.manager [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Terminating instance#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.236 182759 DEBUG nova.compute.manager [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:44:44 np0005591285 kernel: tap3a45d6cb-8d (unregistering): left promiscuous mode
Jan 21 19:44:44 np0005591285 NetworkManager[55017]: <info>  [1769042684.2734] device (tap3a45d6cb-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:44:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:44:44Z|00685|binding|INFO|Releasing lport 3a45d6cb-8d53-4e0f-8011-06cad53a8190 from this chassis (sb_readonly=0)
Jan 21 19:44:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:44:44Z|00686|binding|INFO|Setting lport 3a45d6cb-8d53-4e0f-8011-06cad53a8190 down in Southbound
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.284 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 ovn_controller[94908]: 2026-01-22T00:44:44Z|00687|binding|INFO|Removing iface tap3a45d6cb-8d ovn-installed in OVS
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.297 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:5e:29 10.100.0.7 2001:db8:0:1:f816:3eff:feff:5e29 2001:db8::f816:3eff:feff:5e29'], port_security=['fa:16:3e:ff:5e:29 10.100.0.7 2001:db8:0:1:f816:3eff:feff:5e29 2001:db8::f816:3eff:feff:5e29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8:0:1:f816:3eff:feff:5e29/64 2001:db8::f816:3eff:feff:5e29/64', 'neutron:device_id': 'e1d6bfab-5b5f-4e87-903f-663a797f6e97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '46ea8a3f-4945-4bb2-97cf-c1bd6e8fe825', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=3a45d6cb-8d53-4e0f-8011-06cad53a8190) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.300 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 3a45d6cb-8d53-4e0f-8011-06cad53a8190 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 unbound from our chassis#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.302 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.305 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ff82ce19-e7f9-48c8-8bd8-36cd1d7028f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.307 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 namespace which is not needed anymore#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.320 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 21 19:44:44 np0005591285 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000b5.scope: Consumed 15.671s CPU time.
Jan 21 19:44:44 np0005591285 systemd-machined[154022]: Machine qemu-77-instance-000000b5 terminated.
Jan 21 19:44:44 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [NOTICE]   (243703) : haproxy version is 2.8.14-c23fe91
Jan 21 19:44:44 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [NOTICE]   (243703) : path to executable is /usr/sbin/haproxy
Jan 21 19:44:44 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [WARNING]  (243703) : Exiting Master process...
Jan 21 19:44:44 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [ALERT]    (243703) : Current worker (243705) exited with code 143 (Terminated)
Jan 21 19:44:44 np0005591285 neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016[243699]: [WARNING]  (243703) : All workers exited. Exiting... (0)
Jan 21 19:44:44 np0005591285 systemd[1]: libpod-bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb.scope: Deactivated successfully.
Jan 21 19:44:44 np0005591285 podman[244033]: 2026-01-22 00:44:44.458798084 +0000 UTC m=+0.045738256 container died bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:44:44 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb-userdata-shm.mount: Deactivated successfully.
Jan 21 19:44:44 np0005591285 systemd[1]: var-lib-containers-storage-overlay-16f7b1317b5723a6051563eb45f5150513e3bc15c2b8e4b0424b96705087c381-merged.mount: Deactivated successfully.
Jan 21 19:44:44 np0005591285 podman[244033]: 2026-01-22 00:44:44.507112918 +0000 UTC m=+0.094053090 container cleanup bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.513 182759 INFO nova.virt.libvirt.driver [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Instance destroyed successfully.#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.514 182759 DEBUG nova.objects.instance [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid e1d6bfab-5b5f-4e87-903f-663a797f6e97 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:44:44 np0005591285 systemd[1]: libpod-conmon-bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb.scope: Deactivated successfully.
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.544 182759 DEBUG nova.virt.libvirt.vif [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:43:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1515318656',display_name='tempest-TestGettingAddress-server-1515318656',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1515318656',id=181,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzS1Jx/APWM5vcQRL+j1JWQJn1AI5LoKxMBW97Fa2XcQxLO8wlk0d2rFNEjm5ruItcAVjf35MpAgTKTp3E/600O3yHmKIiUXb2hz3moFXrY6FueGiSaiI56sxuqhWZG5g==',key_name='tempest-TestGettingAddress-79214675',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:43:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-jgzj67i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:43:43Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e1d6bfab-5b5f-4e87-903f-663a797f6e97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", 
"type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.545 182759 DEBUG nova.network.os_vif_util [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.547 182759 DEBUG nova.network.os_vif_util [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.547 182759 DEBUG os_vif [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.550 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a45d6cb-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.551 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.556 182759 INFO os_vif [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:5e:29,bridge_name='br-int',has_traffic_filtering=True,id=3a45d6cb-8d53-4e0f-8011-06cad53a8190,network=Network(bc173f9b-a39e-490e-b1d4-92abd1855016),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a45d6cb-8d')#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.556 182759 INFO nova.virt.libvirt.driver [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Deleting instance files /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97_del#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.557 182759 INFO nova.virt.libvirt.driver [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Deletion of /var/lib/nova/instances/e1d6bfab-5b5f-4e87-903f-663a797f6e97_del complete#033[00m
Jan 21 19:44:44 np0005591285 podman[244078]: 2026-01-22 00:44:44.568157245 +0000 UTC m=+0.039738693 container remove bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.572 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8f849e1a-4d89-450f-94a8-6663918e1e29]: (4, ('Thu Jan 22 12:44:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 (bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb)\nbfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb\nThu Jan 22 12:44:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 (bfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb)\nbfaf5011268f13a45b2c383855cdd285460b190df4c0aa12eca16c90445f96fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.574 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[63f4250a-8a62-4e18-815b-9ba6fb4eb180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.575 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc173f9b-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.576 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 kernel: tapbc173f9b-a0: left promiscuous mode
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.589 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.592 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[30c48d33-a0fe-43ef-a577-4f8c65e80ab9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.606 182759 DEBUG nova.compute.manager [req-cdd392a0-510d-4b44-85b3-b3b4435cce4f req-c4d13b64-29d8-4976-ae99-e9db5bab8b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-vif-unplugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.606 182759 DEBUG oslo_concurrency.lockutils [req-cdd392a0-510d-4b44-85b3-b3b4435cce4f req-c4d13b64-29d8-4976-ae99-e9db5bab8b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.606 182759 DEBUG oslo_concurrency.lockutils [req-cdd392a0-510d-4b44-85b3-b3b4435cce4f req-c4d13b64-29d8-4976-ae99-e9db5bab8b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.606 182759 DEBUG oslo_concurrency.lockutils [req-cdd392a0-510d-4b44-85b3-b3b4435cce4f req-c4d13b64-29d8-4976-ae99-e9db5bab8b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.606 182759 DEBUG nova.compute.manager [req-cdd392a0-510d-4b44-85b3-b3b4435cce4f req-c4d13b64-29d8-4976-ae99-e9db5bab8b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] No waiting events found dispatching network-vif-unplugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.607 182759 DEBUG nova.compute.manager [req-cdd392a0-510d-4b44-85b3-b3b4435cce4f req-c4d13b64-29d8-4976-ae99-e9db5bab8b15 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-vif-unplugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.607 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1d0f3afc-111b-4b6a-967f-57cb10e36a89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.608 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5975daff-de40-4cf2-989d-fc32803c1cec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.622 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7508a642-b8e6-43d0-bc89-457d0d381896]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718145, 'reachable_time': 36821, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244092, 'error': None, 'target': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 systemd[1]: run-netns-ovnmeta\x2dbc173f9b\x2da39e\x2d490e\x2db1d4\x2d92abd1855016.mount: Deactivated successfully.
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.626 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:44:44 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:44:44.627 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3d8e7a-29b2-4682-ad8e-bdfb32449844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.674 182759 INFO nova.compute.manager [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.675 182759 DEBUG oslo.service.loopingcall [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.675 182759 DEBUG nova.compute.manager [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:44:44 np0005591285 nova_compute[182755]: 2026-01-22 00:44:44.676 182759 DEBUG nova.network.neutron [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:44:45 np0005591285 nova_compute[182755]: 2026-01-22 00:44:45.836 182759 DEBUG nova.network.neutron [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:44:45 np0005591285 nova_compute[182755]: 2026-01-22 00:44:45.867 182759 INFO nova.compute.manager [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Took 1.19 seconds to deallocate network for instance.#033[00m
Jan 21 19:44:45 np0005591285 nova_compute[182755]: 2026-01-22 00:44:45.964 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:45 np0005591285 nova_compute[182755]: 2026-01-22 00:44:45.965 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.051 182759 DEBUG nova.compute.provider_tree [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.103 182759 DEBUG nova.scheduler.client.report [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.132 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.204 182759 INFO nova.scheduler.client.report [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance e1d6bfab-5b5f-4e87-903f-663a797f6e97#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.305 182759 DEBUG oslo_concurrency.lockutils [None req-730be766-c09a-44cc-a928-14d02787fba0 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.765 182759 DEBUG nova.compute.manager [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.765 182759 DEBUG oslo_concurrency.lockutils [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.765 182759 DEBUG oslo_concurrency.lockutils [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.766 182759 DEBUG oslo_concurrency.lockutils [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e1d6bfab-5b5f-4e87-903f-663a797f6e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.766 182759 DEBUG nova.compute.manager [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] No waiting events found dispatching network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.766 182759 WARNING nova.compute.manager [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received unexpected event network-vif-plugged-3a45d6cb-8d53-4e0f-8011-06cad53a8190 for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:44:46 np0005591285 nova_compute[182755]: 2026-01-22 00:44:46.766 182759 DEBUG nova.compute.manager [req-afd25498-c442-4618-890d-7e150d04f4be req-e94e740a-9ed8-47c1-b6fe-48a2c420c2fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Received event network-vif-deleted-3a45d6cb-8d53-4e0f-8011-06cad53a8190 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:44:48 np0005591285 nova_compute[182755]: 2026-01-22 00:44:48.301 182759 DEBUG nova.network.neutron [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updated VIF entry in instance network info cache for port 3a45d6cb-8d53-4e0f-8011-06cad53a8190. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:44:48 np0005591285 nova_compute[182755]: 2026-01-22 00:44:48.302 182759 DEBUG nova.network.neutron [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Updating instance_info_cache with network_info: [{"id": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "address": "fa:16:3e:ff:5e:29", "network": {"id": "bc173f9b-a39e-490e-b1d4-92abd1855016", "bridge": "br-int", "label": "tempest-network-smoke--453483768", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:5e29", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a45d6cb-8d", "ovs_interfaceid": "3a45d6cb-8d53-4e0f-8011-06cad53a8190", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:44:48 np0005591285 nova_compute[182755]: 2026-01-22 00:44:48.341 182759 DEBUG oslo_concurrency.lockutils [req-9a380f5e-dcd5-4e3c-a033-44944f5f4705 req-4142d01a-0941-47f6-9fed-924d2c06cf21 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e1d6bfab-5b5f-4e87-903f-663a797f6e97" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:44:49 np0005591285 nova_compute[182755]: 2026-01-22 00:44:49.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:51 np0005591285 nova_compute[182755]: 2026-01-22 00:44:51.115 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:52 np0005591285 podman[244093]: 2026-01-22 00:44:52.188671628 +0000 UTC m=+0.057163963 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6)
Jan 21 19:44:52 np0005591285 podman[244094]: 2026-01-22 00:44:52.194712381 +0000 UTC m=+0.061095830 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 19:44:54 np0005591285 nova_compute[182755]: 2026-01-22 00:44:54.554 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:56 np0005591285 nova_compute[182755]: 2026-01-22 00:44:56.117 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:44:59 np0005591285 nova_compute[182755]: 2026-01-22 00:44:59.509 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042684.507689, e1d6bfab-5b5f-4e87-903f-663a797f6e97 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:44:59 np0005591285 nova_compute[182755]: 2026-01-22 00:44:59.509 182759 INFO nova.compute.manager [-] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:44:59 np0005591285 nova_compute[182755]: 2026-01-22 00:44:59.536 182759 DEBUG nova.compute.manager [None req-a65cf284-09ac-4bae-9a14-b13bc98d4c0d - - - - - -] [instance: e1d6bfab-5b5f-4e87-903f-663a797f6e97] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:44:59 np0005591285 nova_compute[182755]: 2026-01-22 00:44:59.556 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:01 np0005591285 nova_compute[182755]: 2026-01-22 00:45:01.117 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:01 np0005591285 podman[244133]: 2026-01-22 00:45:01.184802944 +0000 UTC m=+0.057857172 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:45:01 np0005591285 nova_compute[182755]: 2026-01-22 00:45:01.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:02 np0005591285 nova_compute[182755]: 2026-01-22 00:45:02.012 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:03.014 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:03.015 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:03.015 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:04 np0005591285 podman[244158]: 2026-01-22 00:45:04.172934963 +0000 UTC m=+0.045132528 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:45:04 np0005591285 podman[244159]: 2026-01-22 00:45:04.172997575 +0000 UTC m=+0.041986064 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:45:04 np0005591285 nova_compute[182755]: 2026-01-22 00:45:04.558 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:06 np0005591285 nova_compute[182755]: 2026-01-22 00:45:06.154 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:07 np0005591285 podman[244199]: 2026-01-22 00:45:07.266275382 +0000 UTC m=+0.136827404 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:45:09 np0005591285 nova_compute[182755]: 2026-01-22 00:45:09.560 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:11 np0005591285 nova_compute[182755]: 2026-01-22 00:45:11.157 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:11 np0005591285 nova_compute[182755]: 2026-01-22 00:45:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:11 np0005591285 nova_compute[182755]: 2026-01-22 00:45:11.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:45:11 np0005591285 nova_compute[182755]: 2026-01-22 00:45:11.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:45:11 np0005591285 nova_compute[182755]: 2026-01-22 00:45:11.248 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:45:11 np0005591285 nova_compute[182755]: 2026-01-22 00:45:11.248 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:14 np0005591285 nova_compute[182755]: 2026-01-22 00:45:14.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:14 np0005591285 nova_compute[182755]: 2026-01-22 00:45:14.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:45:14 np0005591285 nova_compute[182755]: 2026-01-22 00:45:14.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:15 np0005591285 nova_compute[182755]: 2026-01-22 00:45:15.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:16 np0005591285 nova_compute[182755]: 2026-01-22 00:45:16.193 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:17 np0005591285 nova_compute[182755]: 2026-01-22 00:45:17.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:18 np0005591285 nova_compute[182755]: 2026-01-22 00:45:18.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:19 np0005591285 nova_compute[182755]: 2026-01-22 00:45:19.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:21 np0005591285 nova_compute[182755]: 2026-01-22 00:45:21.195 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:21.442 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:45:21 np0005591285 nova_compute[182755]: 2026-01-22 00:45:21.442 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:21.443 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:45:23 np0005591285 podman[244227]: 2026-01-22 00:45:23.187893131 +0000 UTC m=+0.060885074 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 19:45:23 np0005591285 podman[244226]: 2026-01-22 00:45:23.212195716 +0000 UTC m=+0.088547839 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6)
Jan 21 19:45:24 np0005591285 nova_compute[182755]: 2026-01-22 00:45:24.568 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.254 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.436 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.437 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5760MB free_disk=73.17695999145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.437 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.437 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.581 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.581 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.685 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.793 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.793 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.833 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.860 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.889 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.904 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.938 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:45:25 np0005591285 nova_compute[182755]: 2026-01-22 00:45:25.939 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:26 np0005591285 nova_compute[182755]: 2026-01-22 00:45:26.198 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:27 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:27.446 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:27 np0005591285 nova_compute[182755]: 2026-01-22 00:45:27.941 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:45:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:29.042 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:5f:6e 10.100.0.2 2001:db8::f816:3eff:fee8:5f6e'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee8:5f6e/64', 'neutron:device_id': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4a596305-d10e-4e9e-a8ea-d94a630e8baa) old=Port_Binding(mac=['fa:16:3e:e8:5f:6e 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:45:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:29.044 104259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4a596305-d10e-4e9e-a8ea-d94a630e8baa in datapath 83666af9-15ce-4344-a623-7180c9b2515a updated#033[00m
Jan 21 19:45:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:29.045 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 83666af9-15ce-4344-a623-7180c9b2515a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:45:29 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:29.046 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[986ce533-fac2-4565-9bf9-1fda3230f0d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:29 np0005591285 nova_compute[182755]: 2026-01-22 00:45:29.592 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:31 np0005591285 nova_compute[182755]: 2026-01-22 00:45:31.201 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:32 np0005591285 podman[244265]: 2026-01-22 00:45:32.227044337 +0000 UTC m=+0.085513818 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:45:34 np0005591285 nova_compute[182755]: 2026-01-22 00:45:34.594 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:35 np0005591285 podman[244289]: 2026-01-22 00:45:35.207827559 +0000 UTC m=+0.079171037 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:45:35 np0005591285 podman[244290]: 2026-01-22 00:45:35.209944127 +0000 UTC m=+0.066825655 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:45:36 np0005591285 nova_compute[182755]: 2026-01-22 00:45:36.244 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:36 np0005591285 nova_compute[182755]: 2026-01-22 00:45:36.974 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:36 np0005591285 nova_compute[182755]: 2026-01-22 00:45:36.974 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:36 np0005591285 nova_compute[182755]: 2026-01-22 00:45:36.995 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.143 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.143 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.153 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.153 182759 INFO nova.compute.claims [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.315 182759 DEBUG nova.compute.provider_tree [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.343 182759 DEBUG nova.scheduler.client.report [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.362 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.363 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.438 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.439 182759 DEBUG nova.network.neutron [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.461 182759 INFO nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.485 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.668 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.669 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.670 182759 INFO nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Creating image(s)#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.670 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.671 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.671 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.686 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.759 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.761 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.762 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.787 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.844 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.845 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.877 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.879 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.879 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.944 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.945 182759 DEBUG nova.virt.disk.api [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:45:37 np0005591285 nova_compute[182755]: 2026-01-22 00:45:37.945 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.030 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.031 182759 DEBUG nova.virt.disk.api [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.032 182759 DEBUG nova.objects.instance [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid b53b9c71-63b9-497f-a60b-07fe6f17dad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.059 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.059 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Ensure instance console log exists: /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.060 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.060 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.060 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:38 np0005591285 podman[244343]: 2026-01-22 00:45:38.22752755 +0000 UTC m=+0.094150151 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:45:38 np0005591285 nova_compute[182755]: 2026-01-22 00:45:38.674 182759 DEBUG nova.policy [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 21 19:45:39 np0005591285 nova_compute[182755]: 2026-01-22 00:45:39.645 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:40 np0005591285 nova_compute[182755]: 2026-01-22 00:45:40.523 182759 DEBUG nova.network.neutron [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Successfully created port: f87e7c5b-000a-44c7-a7e8-b7e97027b22d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:45:41 np0005591285 nova_compute[182755]: 2026-01-22 00:45:41.245 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.335 182759 DEBUG nova.network.neutron [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Successfully updated port: f87e7c5b-000a-44c7-a7e8-b7e97027b22d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.354 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.355 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.355 182759 DEBUG nova.network.neutron [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.553 182759 DEBUG nova.compute.manager [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-changed-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.553 182759 DEBUG nova.compute.manager [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Refreshing instance network info cache due to event network-changed-f87e7c5b-000a-44c7-a7e8-b7e97027b22d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.554 182759 DEBUG oslo_concurrency.lockutils [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:45:42 np0005591285 nova_compute[182755]: 2026-01-22 00:45:42.620 182759 DEBUG nova.network.neutron [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.410 182759 DEBUG nova.network.neutron [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updating instance_info_cache with network_info: [{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.439 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.439 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Instance network_info: |[{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.440 182759 DEBUG oslo_concurrency.lockutils [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.440 182759 DEBUG nova.network.neutron [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Refreshing network info cache for port f87e7c5b-000a-44c7-a7e8-b7e97027b22d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.442 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Start _get_guest_xml network_info=[{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.447 182759 WARNING nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.457 182759 DEBUG nova.virt.libvirt.host [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.458 182759 DEBUG nova.virt.libvirt.host [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.465 182759 DEBUG nova.virt.libvirt.host [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.465 182759 DEBUG nova.virt.libvirt.host [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.466 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.466 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.467 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.467 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.467 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.467 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.468 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.468 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.468 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.468 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.469 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.469 182759 DEBUG nova.virt.hardware [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.473 182759 DEBUG nova.virt.libvirt.vif [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:45:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-362739796',display_name='tempest-TestGettingAddress-server-362739796',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-362739796',id=183,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3/PLQY4lAQU2yFGaoAmqWPJI5565ofTauEAmPcwEncHglgrmt+9X41pDrGx2Hzo63wjxi644i8QnD2R87vFxz3Kmnkg4MUbe27S7AT4N98N34iBfOk+UwjPX/szWkvLg==',key_name='tempest-TestGettingAddress-2046948813',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-hs0y34u0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:45:37Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=b53b9c71-63b9-497f-a60b-07fe6f17dad1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.473 182759 DEBUG nova.network.os_vif_util [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.474 182759 DEBUG nova.network.os_vif_util [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.474 182759 DEBUG nova.objects.instance [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid b53b9c71-63b9-497f-a60b-07fe6f17dad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.514 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <uuid>b53b9c71-63b9-497f-a60b-07fe6f17dad1</uuid>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <name>instance-000000b7</name>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestGettingAddress-server-362739796</nova:name>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:45:44</nova:creationTime>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        <nova:port uuid="f87e7c5b-000a-44c7-a7e8-b7e97027b22d">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe74:c8f6" ipVersion="6"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <entry name="serial">b53b9c71-63b9-497f-a60b-07fe6f17dad1</entry>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <entry name="uuid">b53b9c71-63b9-497f-a60b-07fe6f17dad1</entry>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.config"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:74:c8:f6"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <target dev="tapf87e7c5b-00"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/console.log" append="off"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:45:44 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:45:44 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:45:44 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:45:44 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.516 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Preparing to wait for external event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.516 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.516 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.516 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.517 182759 DEBUG nova.virt.libvirt.vif [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:45:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-362739796',display_name='tempest-TestGettingAddress-server-362739796',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-362739796',id=183,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3/PLQY4lAQU2yFGaoAmqWPJI5565ofTauEAmPcwEncHglgrmt+9X41pDrGx2Hzo63wjxi644i8QnD2R87vFxz3Kmnkg4MUbe27S7AT4N98N34iBfOk+UwjPX/szWkvLg==',key_name='tempest-TestGettingAddress-2046948813',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-hs0y34u0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:45:37Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=b53b9c71-63b9-497f-a60b-07fe6f17dad1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.517 182759 DEBUG nova.network.os_vif_util [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.518 182759 DEBUG nova.network.os_vif_util [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.518 182759 DEBUG os_vif [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.519 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.519 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.522 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.522 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf87e7c5b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.523 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf87e7c5b-00, col_values=(('external_ids', {'iface-id': 'f87e7c5b-000a-44c7-a7e8-b7e97027b22d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:c8:f6', 'vm-uuid': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.574 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:44 np0005591285 NetworkManager[55017]: <info>  [1769042744.5752] manager: (tapf87e7c5b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.580 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.581 182759 INFO os_vif [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00')#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.648 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.649 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.649 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:74:c8:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:45:44 np0005591285 nova_compute[182755]: 2026-01-22 00:45:44.650 182759 INFO nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Using config drive#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.217 182759 INFO nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Creating config drive at /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.config#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.223 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphj8q6vtn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.347 182759 DEBUG oslo_concurrency.processutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphj8q6vtn" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:45:45 np0005591285 kernel: tapf87e7c5b-00: entered promiscuous mode
Jan 21 19:45:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:45Z|00688|binding|INFO|Claiming lport f87e7c5b-000a-44c7-a7e8-b7e97027b22d for this chassis.
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.405 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:45Z|00689|binding|INFO|f87e7c5b-000a-44c7-a7e8-b7e97027b22d: Claiming fa:16:3e:74:c8:f6 10.100.0.4 2001:db8::f816:3eff:fe74:c8f6
Jan 21 19:45:45 np0005591285 NetworkManager[55017]: <info>  [1769042745.4074] manager: (tapf87e7c5b-00): new Tun device (/org/freedesktop/NetworkManager/Devices/341)
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.408 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.423 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:c8:f6 10.100.0.4 2001:db8::f816:3eff:fe74:c8f6'], port_security=['fa:16:3e:74:c8:f6 10.100.0.4 2001:db8::f816:3eff:fe74:c8f6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe74:c8f6/64', 'neutron:device_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bd69fbc7-ff38-42ce-b5d5-6559f7285ccb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=f87e7c5b-000a-44c7-a7e8-b7e97027b22d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.425 104259 INFO neutron.agent.ovn.metadata.agent [-] Port f87e7c5b-000a-44c7-a7e8-b7e97027b22d in datapath 83666af9-15ce-4344-a623-7180c9b2515a bound to our chassis#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.426 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 83666af9-15ce-4344-a623-7180c9b2515a#033[00m
Jan 21 19:45:45 np0005591285 systemd-udevd[244385]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.439 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[49320df3-b5a1-4949-a343-e9c69ff86494]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.440 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap83666af9-11 in ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.442 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap83666af9-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.442 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fafd992a-0f5e-4bf6-b528-a0f3e0f1c886]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.443 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[dd700a3c-c703-4d4c-8433-fb8f9b273f18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 NetworkManager[55017]: <info>  [1769042745.4458] device (tapf87e7c5b-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:45:45 np0005591285 NetworkManager[55017]: <info>  [1769042745.4466] device (tapf87e7c5b-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.456 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[35259f11-e99e-4af6-9608-c75999dd2463]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.462 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 systemd-machined[154022]: New machine qemu-78-instance-000000b7.
Jan 21 19:45:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:45Z|00690|binding|INFO|Setting lport f87e7c5b-000a-44c7-a7e8-b7e97027b22d ovn-installed in OVS
Jan 21 19:45:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:45Z|00691|binding|INFO|Setting lport f87e7c5b-000a-44c7-a7e8-b7e97027b22d up in Southbound
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.468 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 systemd[1]: Started Virtual Machine qemu-78-instance-000000b7.
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.482 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[16d99e82-e1ea-4eb8-a721-66a25b4df1d2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.525 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4d62c9a9-2baa-4bd7-b759-509b9d96ba99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.529 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc77c0b-cb7e-4f39-97d6-b685961c4436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 NetworkManager[55017]: <info>  [1769042745.5305] manager: (tap83666af9-10): new Veth device (/org/freedesktop/NetworkManager/Devices/342)
Jan 21 19:45:45 np0005591285 systemd-udevd[244390]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.559 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[0434e860-e3c2-40fc-b0db-0012d3528004]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.562 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0d6f0d-6ba7-4f0e-aba4-9d4286077e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 NetworkManager[55017]: <info>  [1769042745.5843] device (tap83666af9-10): carrier: link connected
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.588 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0093d7-b82c-4cfb-8b78-4a55f29687e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.607 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a11d9e29-4699-42d4-a7ad-6619fd6e4c1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap83666af9-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:5f:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730424, 'reachable_time': 28427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244423, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.622 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[7f413f72-94d9-43a0-8d43-96c4811be7c9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:5f6e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 730424, 'tstamp': 730424}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244424, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.744 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1942824d-bfad-4646-ad9c-e23458f04c41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap83666af9-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:5f:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730424, 'reachable_time': 28427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244425, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.778 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f77720-d5ed-495a-8478-d62fd0429b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.861 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[07378799-f8bb-4e41-8636-2f2fb8befa17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.862 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83666af9-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.862 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.862 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83666af9-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.864 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 NetworkManager[55017]: <info>  [1769042745.8654] manager: (tap83666af9-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 21 19:45:45 np0005591285 kernel: tap83666af9-10: entered promiscuous mode
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.869 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap83666af9-10, col_values=(('external_ids', {'iface-id': '4a596305-d10e-4e9e-a8ea-d94a630e8baa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.868 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.870 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:45Z|00692|binding|INFO|Releasing lport 4a596305-d10e-4e9e-a8ea-d94a630e8baa from this chassis (sb_readonly=0)
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.884 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.885 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/83666af9-15ce-4344-a623-7180c9b2515a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/83666af9-15ce-4344-a623-7180c9b2515a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.886 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a0571469-6efe-4379-86af-bba57694ddf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.887 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-83666af9-15ce-4344-a623-7180c9b2515a
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/83666af9-15ce-4344-a623-7180c9b2515a.pid.haproxy
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 83666af9-15ce-4344-a623-7180c9b2515a
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:45:45 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:45:45.887 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'env', 'PROCESS_TAG=haproxy-83666af9-15ce-4344-a623-7180c9b2515a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/83666af9-15ce-4344-a623-7180c9b2515a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.927 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042745.9267113, b53b9c71-63b9-497f-a60b-07fe6f17dad1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.927 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] VM Started (Lifecycle Event)#033[00m
Jan 21 19:45:45 np0005591285 nova_compute[182755]: 2026-01-22 00:45:45.994 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.000 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042745.9269638, b53b9c71-63b9-497f-a60b-07fe6f17dad1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.001 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.068 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.074 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.126 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.247 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:46 np0005591285 podman[244464]: 2026-01-22 00:45:46.254686266 +0000 UTC m=+0.049823576 container create 669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 19:45:46 np0005591285 systemd[1]: Started libpod-conmon-669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c.scope.
Jan 21 19:45:46 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:45:46 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/156bbbc55167c4bf0f8269a27234d2f7a7576ba89b6a5a72995d64162a72f16c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:45:46 np0005591285 podman[244464]: 2026-01-22 00:45:46.230657517 +0000 UTC m=+0.025794847 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:45:46 np0005591285 podman[244464]: 2026-01-22 00:45:46.339005251 +0000 UTC m=+0.134142581 container init 669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 19:45:46 np0005591285 podman[244464]: 2026-01-22 00:45:46.345918807 +0000 UTC m=+0.141056117 container start 669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 19:45:46 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [NOTICE]   (244484) : New worker (244486) forked
Jan 21 19:45:46 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [NOTICE]   (244484) : Loading success.
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.429 182759 DEBUG nova.compute.manager [req-198e842d-7ac0-414c-a3d7-cb0b48bdd603 req-4c50155e-74bc-44fd-9a3f-9129477a3197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.430 182759 DEBUG oslo_concurrency.lockutils [req-198e842d-7ac0-414c-a3d7-cb0b48bdd603 req-4c50155e-74bc-44fd-9a3f-9129477a3197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.430 182759 DEBUG oslo_concurrency.lockutils [req-198e842d-7ac0-414c-a3d7-cb0b48bdd603 req-4c50155e-74bc-44fd-9a3f-9129477a3197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.431 182759 DEBUG oslo_concurrency.lockutils [req-198e842d-7ac0-414c-a3d7-cb0b48bdd603 req-4c50155e-74bc-44fd-9a3f-9129477a3197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.431 182759 DEBUG nova.compute.manager [req-198e842d-7ac0-414c-a3d7-cb0b48bdd603 req-4c50155e-74bc-44fd-9a3f-9129477a3197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Processing event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.431 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.435 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042746.4348865, b53b9c71-63b9-497f-a60b-07fe6f17dad1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.435 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.436 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.439 182759 INFO nova.virt.libvirt.driver [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Instance spawned successfully.#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.439 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.474 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.479 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.482 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.482 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.483 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.483 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.483 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.484 182759 DEBUG nova.virt.libvirt.driver [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.527 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.566 182759 INFO nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Took 8.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.566 182759 DEBUG nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.662 182759 INFO nova.compute.manager [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Took 9.57 seconds to build instance.#033[00m
Jan 21 19:45:46 np0005591285 nova_compute[182755]: 2026-01-22 00:45:46.736 182759 DEBUG oslo_concurrency.lockutils [None req-325002f8-a7fa-40aa-9f32-80116053cf0e a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.282 182759 DEBUG nova.network.neutron [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updated VIF entry in instance network info cache for port f87e7c5b-000a-44c7-a7e8-b7e97027b22d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.284 182759 DEBUG nova.network.neutron [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updating instance_info_cache with network_info: [{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.303 182759 DEBUG oslo_concurrency.lockutils [req-6f10cfe9-7c6f-44f8-b215-68233e21ae49 req-ca769db9-6501-46b1-9e22-77003b6a4d89 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.581 182759 DEBUG nova.compute.manager [req-5d104f6b-c310-4402-be61-0be3474f05cf req-83f872c7-5cdf-4ad8-a848-c3b78bacc737 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.582 182759 DEBUG oslo_concurrency.lockutils [req-5d104f6b-c310-4402-be61-0be3474f05cf req-83f872c7-5cdf-4ad8-a848-c3b78bacc737 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.582 182759 DEBUG oslo_concurrency.lockutils [req-5d104f6b-c310-4402-be61-0be3474f05cf req-83f872c7-5cdf-4ad8-a848-c3b78bacc737 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.583 182759 DEBUG oslo_concurrency.lockutils [req-5d104f6b-c310-4402-be61-0be3474f05cf req-83f872c7-5cdf-4ad8-a848-c3b78bacc737 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.583 182759 DEBUG nova.compute.manager [req-5d104f6b-c310-4402-be61-0be3474f05cf req-83f872c7-5cdf-4ad8-a848-c3b78bacc737 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] No waiting events found dispatching network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:45:48 np0005591285 nova_compute[182755]: 2026-01-22 00:45:48.584 182759 WARNING nova.compute.manager [req-5d104f6b-c310-4402-be61-0be3474f05cf req-83f872c7-5cdf-4ad8-a848-c3b78bacc737 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received unexpected event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d for instance with vm_state active and task_state None.#033[00m
Jan 21 19:45:49 np0005591285 nova_compute[182755]: 2026-01-22 00:45:49.574 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:51 np0005591285 nova_compute[182755]: 2026-01-22 00:45:51.248 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:54 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:54Z|00693|binding|INFO|Releasing lport 4a596305-d10e-4e9e-a8ea-d94a630e8baa from this chassis (sb_readonly=0)
Jan 21 19:45:54 np0005591285 NetworkManager[55017]: <info>  [1769042754.1755] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Jan 21 19:45:54 np0005591285 NetworkManager[55017]: <info>  [1769042754.1769] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Jan 21 19:45:54 np0005591285 nova_compute[182755]: 2026-01-22 00:45:54.168 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:54 np0005591285 podman[244495]: 2026-01-22 00:45:54.189279332 +0000 UTC m=+0.062078486 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 21 19:45:54 np0005591285 podman[244496]: 2026-01-22 00:45:54.200461804 +0000 UTC m=+0.070235336 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 19:45:54 np0005591285 ovn_controller[94908]: 2026-01-22T00:45:54Z|00694|binding|INFO|Releasing lport 4a596305-d10e-4e9e-a8ea-d94a630e8baa from this chassis (sb_readonly=0)
Jan 21 19:45:54 np0005591285 nova_compute[182755]: 2026-01-22 00:45:54.219 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:54 np0005591285 nova_compute[182755]: 2026-01-22 00:45:54.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:55 np0005591285 nova_compute[182755]: 2026-01-22 00:45:55.939 182759 DEBUG nova.compute.manager [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-changed-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:45:55 np0005591285 nova_compute[182755]: 2026-01-22 00:45:55.940 182759 DEBUG nova.compute.manager [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Refreshing instance network info cache due to event network-changed-f87e7c5b-000a-44c7-a7e8-b7e97027b22d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:45:55 np0005591285 nova_compute[182755]: 2026-01-22 00:45:55.940 182759 DEBUG oslo_concurrency.lockutils [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:45:55 np0005591285 nova_compute[182755]: 2026-01-22 00:45:55.940 182759 DEBUG oslo_concurrency.lockutils [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:45:55 np0005591285 nova_compute[182755]: 2026-01-22 00:45:55.941 182759 DEBUG nova.network.neutron [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Refreshing network info cache for port f87e7c5b-000a-44c7-a7e8-b7e97027b22d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:45:56 np0005591285 nova_compute[182755]: 2026-01-22 00:45:56.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:45:58 np0005591285 nova_compute[182755]: 2026-01-22 00:45:58.395 182759 DEBUG nova.network.neutron [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updated VIF entry in instance network info cache for port f87e7c5b-000a-44c7-a7e8-b7e97027b22d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:45:58 np0005591285 nova_compute[182755]: 2026-01-22 00:45:58.395 182759 DEBUG nova.network.neutron [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updating instance_info_cache with network_info: [{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:45:58 np0005591285 nova_compute[182755]: 2026-01-22 00:45:58.763 182759 DEBUG oslo_concurrency.lockutils [req-b68281f4-9f2e-4b7a-bd7c-509f38fd7835 req-31692675-97a8-482f-a6b5-b5fbcee8691d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:45:59 np0005591285 nova_compute[182755]: 2026-01-22 00:45:59.579 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:00 np0005591285 ovn_controller[94908]: 2026-01-22T00:46:00Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:c8:f6 10.100.0.4
Jan 21 19:46:00 np0005591285 ovn_controller[94908]: 2026-01-22T00:46:00Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:c8:f6 10.100.0.4
Jan 21 19:46:01 np0005591285 nova_compute[182755]: 2026-01-22 00:46:01.253 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:03.015 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:03.016 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:03.017 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:03 np0005591285 podman[244547]: 2026-01-22 00:46:03.197319158 +0000 UTC m=+0.065931720 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:46:04 np0005591285 nova_compute[182755]: 2026-01-22 00:46:04.581 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:06 np0005591285 podman[244571]: 2026-01-22 00:46:06.180608768 +0000 UTC m=+0.053066993 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:46:06 np0005591285 podman[244572]: 2026-01-22 00:46:06.186851846 +0000 UTC m=+0.055617962 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:46:06 np0005591285 nova_compute[182755]: 2026-01-22 00:46:06.253 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:09 np0005591285 podman[244615]: 2026-01-22 00:46:09.225957641 +0000 UTC m=+0.103695689 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:46:09 np0005591285 nova_compute[182755]: 2026-01-22 00:46:09.583 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:11 np0005591285 nova_compute[182755]: 2026-01-22 00:46:11.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.414 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.415 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquired lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.415 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 21 19:46:13 np0005591285 nova_compute[182755]: 2026-01-22 00:46:13.416 182759 DEBUG nova.objects.instance [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b53b9c71-63b9-497f-a60b-07fe6f17dad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.607 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.883 182759 DEBUG nova.network.neutron [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updating instance_info_cache with network_info: [{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.899 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Releasing lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.900 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.900 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.901 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:14 np0005591285 nova_compute[182755]: 2026-01-22 00:46:14.901 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:46:16 np0005591285 nova_compute[182755]: 2026-01-22 00:46:16.260 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:18 np0005591285 nova_compute[182755]: 2026-01-22 00:46:18.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:18 np0005591285 nova_compute[182755]: 2026-01-22 00:46:18.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:19 np0005591285 nova_compute[182755]: 2026-01-22 00:46:19.609 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:20 np0005591285 nova_compute[182755]: 2026-01-22 00:46:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:21 np0005591285 nova_compute[182755]: 2026-01-22 00:46:21.263 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.189 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'name': 'tempest-TestGettingAddress-server-362739796', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b7', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.224 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.226 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d9d0f2e-4d14-4570-9081-f0b8c733f380', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.192156', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c65ef872-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': 'a5ed2f0257974eed430881766b36d6eba210778aaca49f7167aa60b34ae544f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.192156', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c65f28e2-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': 'a41f67f5b489bfc45375eac49a7dad9a1419ced6fc61fe8c7c17fcf75dfaaa25'}]}, 'timestamp': '2026-01-22 00:46:23.227169', '_unique_id': '073ac7e695154aabb9e28f981087dc67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.231 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.239 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b53b9c71-63b9-497f-a60b-07fe6f17dad1 / tapf87e7c5b-00 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.240 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.incoming.bytes volume: 4313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceaf65c2-8866-4c4e-9f99-5a719d00a628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4313, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.235392', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c661695e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': '1d20af77db144147690ee488e419bbe8f1ff5350de19c760b2926b4267c51d46'}]}, 'timestamp': '2026-01-22 00:46:23.241670', '_unique_id': 'ef73cd2c929740058292fb4945caa2d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.243 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.264 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.265 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '971bb49a-d9d8-4d4b-825a-40f7190292b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.246078', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c664ff60-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.96534439, 'message_signature': '1db4ab09bae83ae24625bdc28800fb934fbb972f5934ac8a78108d7c97623a6d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.246078', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c6652076-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.96534439, 'message_signature': 'a0b29f6456d670d6c486da1b9a4df3061342317830f0e1baafd53e45c2e7cb01'}]}, 'timestamp': '2026-01-22 00:46:23.265904', '_unique_id': '4561c93950ae4eb689fa6cb05bb0f536'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.267 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.269 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.270 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d7421f5-fd08-444f-b65c-20a271320352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.270075', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c665df48-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': '2bef346834a86ab8dcd4af78b837ed2710892487f2bbefb42e32b5af69b049db'}]}, 'timestamp': '2026-01-22 00:46:23.270819', '_unique_id': '1e8df3ab52cf4ec08972956c9ee225a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.271 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.274 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.274 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.outgoing.bytes volume: 3704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59037c6d-905e-49e0-87fc-5d40651db047', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3704, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.274516', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c66688b2-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': 'f39d240f5a3a6a21100f5976692e67b0db91d6fb20e97c8ccccc15cd653db87e'}]}, 'timestamp': '2026-01-22 00:46:23.275108', '_unique_id': 'e028624e0b6f461fb007d5929f9ac46f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.276 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.277 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.277 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.incoming.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df0539bc-c9f3-4757-a774-fc99540abc1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.277241', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c666ee74-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': 'a3f87d740c3508d946e75aa62869350e9531430028093e4b67c0277cb0dc2c46'}]}, 'timestamp': '2026-01-22 00:46:23.277581', '_unique_id': 'a72cd4a7b5e6493cacad1709e192e650'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.278 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.279 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.279 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>]
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.279 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.279 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>]
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.280 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f09437b-3ad9-4e7b-8b59-d3c2a334ea46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.280187', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c66760f2-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': 'b7ca2849d93dd1e182d5f282b7d2be1f5c7346f8758e7b1abe3a66f4b4e010b5'}]}, 'timestamp': '2026-01-22 00:46:23.280516', '_unique_id': '937907771535406bbaf2f547f4b26a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.281 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.282 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.write.bytes volume: 72957952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.282 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e14d34b-feef-42f1-ac43-7f16d19ce059', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72957952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.282168', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c667ade6-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': 'ebafb2154bb6111668b8dd1770b21d28549a61ce60755800a70d026064810bea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.282168', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c667b85e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': '7e96fa2cc3e9aae6b7005f63257c9a84f0f0d1a59355864b9c80cf0d2a97d0d4'}]}, 'timestamp': '2026-01-22 00:46:23.282724', '_unique_id': '346c3d976e4a46b1be230c4774437435'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.283 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.284 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.read.latency volume: 178619996 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.284 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.read.latency volume: 25820775 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e76e54e2-db19-4f2d-b954-caadf24162a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 178619996, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.284299', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c6680480-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': 'da6662ee7a479c540de35c71e5a0a4e116bda4d446d2e96c8cdc12d0345fe8e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25820775, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.284299', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c668133a-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': '45e6755e222eb50273a959643c8b2a762de70e91792c45b95b30e36edb2d4341'}]}, 'timestamp': '2026-01-22 00:46:23.285106', '_unique_id': 'de40606594044401a88e7b87cb408201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.285 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.286 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.309 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/cpu volume: 11840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e6d8f1f-ce09-4788-b854-5d5d6997f369', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11840000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'timestamp': '2026-01-22T00:46:23.286834', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c66bd39e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7342.028300759, 'message_signature': '95c5b8dc76dedef83b81d9d7b79d6cd1ae56e520529f15d6d23ca19483badfd1'}]}, 'timestamp': '2026-01-22 00:46:23.309710', '_unique_id': 'f1ae8dfd92bb49a8812d1885302ca8ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.310 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.311 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d102750-5332-4842-bf31-32c7462ea364', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.311713', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c66c31a4-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': '7f8e42ec6288a38fd24ae76ca8c0a2aa5a8355084415220cfd977d97b974d730'}]}, 'timestamp': '2026-01-22 00:46:23.312084', '_unique_id': '6d1c3488f9644541bfa14726caf7f6ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.312 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.313 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.313 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/memory.usage volume: 46.546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc205ad6-8b99-41e0-bb13-37609f1d25c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.546875, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'timestamp': '2026-01-22T00:46:23.313651', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c66c7cae-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7342.028300759, 'message_signature': '935fe85a63038a3e20a6ef6d20844f35394370fe398efc993675485149d5bb7a'}]}, 'timestamp': '2026-01-22 00:46:23.314024', '_unique_id': 'c8da7a10c26e492abb5690f3786faa3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.314 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.316 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.write.latency volume: 2873270036 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.316 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b89d7ee-bb7e-45fe-9df3-1385240a0ca2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2873270036, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.316098', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c66cde38-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': '3bcab74250ca0ec2c5214bb38eea4c2b173f5d75015bd3908006f2f88242b7a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.316098', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c66ceb94-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': 'b4df1d44cb1ec0404f13d6af13b18b9d00765bf8fc6985a9ef2497607aaa8faa'}]}, 'timestamp': '2026-01-22 00:46:23.316811', '_unique_id': 'c789487b47524e9fa0c3abe63a831f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.318 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.319 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.319 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3d16eff-0956-43ca-89e5-be6cadf1388f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.318966', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c66d4e04-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.96534439, 'message_signature': 'cacc60f4774de017847e023e21b77f05a856d418996ac2bacbcfda24eda6a5eb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.318966', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c66d5c3c-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.96534439, 'message_signature': '2202438027c1ecac8886916124843b76efa08876fddc1a206d982c716f74ea5e'}]}, 'timestamp': '2026-01-22 00:46:23.319697', '_unique_id': 'e36bc932d00e4e78a03efae2fa9813c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.321 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.321 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.322 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>]
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.322 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b835dd01-0b4d-4e58-9f82-588c401e205b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.322490', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c66dd766-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': 'f001ef9716a761f24557d5152d38f3e7dfc50b56290ad942f7f4b404bff682af'}]}, 'timestamp': '2026-01-22 00:46:23.322890', '_unique_id': '39207bc1f6134328a68618e1883d22f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.325 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-362739796>]
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6951b8fa-8a40-46ba-9521-c95bfe4df386', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.325543', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c66e4f34-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': '66743f6d240d4a00660b28d910814e9e74e2b98571b1ef57af2fe095c5964c47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.325543', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c66e5f2e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': '8a97edd701894d9cc37e007c589e5adaf60bf6e024a0589905bda50261568eae'}]}, 'timestamp': '2026-01-22 00:46:23.326326', '_unique_id': '8a8d18585d4a4e8693243885908982f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.326 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.328 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c62723e-58d1-4857-9dae-721e35b2854a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.328553', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c66ec612-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': 'ef37d4538c2a647f2f2bf204e6ac065307e7f8a2000b438abc75e3ae19c5a1b2'}]}, 'timestamp': '2026-01-22 00:46:23.329058', '_unique_id': '1286ef56e0094f2cbeaf70782bad8aab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.329 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.331 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.331 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ede3dc74-fccf-41d8-9499-14a3177bf98c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.331305', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c66f31ba-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.96534439, 'message_signature': '79df0952a3e2e163b9b3ba4eb6d2f1b201c1c6e6c1853789e61dd9c54f5b24f4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.331305', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c66f4498-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.96534439, 'message_signature': '6f28e219c01dc65f716997ec0718cec2c23cb657db584f0bf48306d07b17eebb'}]}, 'timestamp': '2026-01-22 00:46:23.332253', '_unique_id': 'fcd722caf04f4e41972b5beb08c2406f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.334 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.334 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '979458a4-162f-472a-8aaa-03737a1cb497', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.334545', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c66fae7e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': '57df2f231aa8df317e13511e9d8677c72a180e554f765bd0f427ea95c9d0ba5a'}]}, 'timestamp': '2026-01-22 00:46:23.334950', '_unique_id': '08b0a4fb1b1e4973ab02c51df6d5c5b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.336 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.337 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d816024-e108-466f-8f19-246dba6fc365', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b7-b53b9c71-63b9-497f-a60b-07fe6f17dad1-tapf87e7c5b-00', 'timestamp': '2026-01-22T00:46:23.337051', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'tapf87e7c5b-00', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:c8:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf87e7c5b-00'}, 'message_id': 'c67012c4-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.954710383, 'message_signature': '8a7a3f46ccf3da215478440b94537f6f6d152576038505eff5c0d92b1e2cf9fc'}]}, 'timestamp': '2026-01-22 00:46:23.337521', '_unique_id': '97ae9c5239fa43b0a57fc04c6d0a152f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.338 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.339 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.340 12 DEBUG ceilometer.compute.pollsters [-] b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10572fdf-1d9b-4525-8161-db27dc09db02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-vda', 'timestamp': '2026-01-22T00:46:23.339716', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c670799e-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': '6a6d3f34e63c0f974ab211dc11df9990bbc10e5bd1e7a4bb665631968a2363ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1-sda', 'timestamp': '2026-01-22T00:46:23.339716', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-362739796', 'name': 'instance-000000b7', 'instance_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'instance_type': 'm1.nano', 'host': '164c0a9a6acac8f4385343b5316ca58358db1c583e55b6d4f56f9ab0', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c6708736-f72b-11f0-b13b-fa163e425b77', 'monotonic_time': 7341.911524037, 'message_signature': 'cb27d5f3ca0617ca339f20831995637b39ccff671429b92dc7dfa73226010ea2'}]}, 'timestamp': '2026-01-22 00:46:23.340456', '_unique_id': '5af3d779e269466793f40c96c6f3b1fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 19:46:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:46:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 21 19:46:24 np0005591285 nova_compute[182755]: 2026-01-22 00:46:24.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:25 np0005591285 podman[244643]: 2026-01-22 00:46:25.205788173 +0000 UTC m=+0.074184903 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 21 19:46:25 np0005591285 podman[244642]: 2026-01-22 00:46:25.214157309 +0000 UTC m=+0.083085664 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, release=1755695350)
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.287 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.287 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.288 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.288 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.367 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.427 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.430 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.510 182759 DEBUG oslo_concurrency.processutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:46:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:25.645 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:46:25 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:25.645 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.652 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.666 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.668 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5560MB free_disk=73.14801788330078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.668 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.668 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.781 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Instance b53b9c71-63b9-497f-a60b-07fe6f17dad1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.782 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.782 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.833 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.846 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.864 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:46:25 np0005591285 nova_compute[182755]: 2026-01-22 00:46:25.865 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:26 np0005591285 nova_compute[182755]: 2026-01-22 00:46:26.265 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:29 np0005591285 nova_compute[182755]: 2026-01-22 00:46:29.614 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:29 np0005591285 nova_compute[182755]: 2026-01-22 00:46:29.865 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:31 np0005591285 nova_compute[182755]: 2026-01-22 00:46:31.268 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:34 np0005591285 podman[244691]: 2026-01-22 00:46:34.187358188 +0000 UTC m=+0.054376128 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:46:34 np0005591285 nova_compute[182755]: 2026-01-22 00:46:34.616 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:34.647 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:46:36 np0005591285 nova_compute[182755]: 2026-01-22 00:46:36.268 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:37 np0005591285 podman[244716]: 2026-01-22 00:46:37.192841766 +0000 UTC m=+0.059746853 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:46:37 np0005591285 podman[244717]: 2026-01-22 00:46:37.217847931 +0000 UTC m=+0.081310366 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:46:38 np0005591285 nova_compute[182755]: 2026-01-22 00:46:38.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:46:39 np0005591285 nova_compute[182755]: 2026-01-22 00:46:39.619 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:40 np0005591285 podman[244759]: 2026-01-22 00:46:40.267490951 +0000 UTC m=+0.144117721 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 21 19:46:41 np0005591285 nova_compute[182755]: 2026-01-22 00:46:41.270 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:44 np0005591285 nova_compute[182755]: 2026-01-22 00:46:44.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:46 np0005591285 nova_compute[182755]: 2026-01-22 00:46:46.273 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.624 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.660 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.661 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.662 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.662 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.663 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.683 182759 INFO nova.compute.manager [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Terminating instance#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.699 182759 DEBUG nova.compute.manager [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:46:49 np0005591285 kernel: tapf87e7c5b-00 (unregistering): left promiscuous mode
Jan 21 19:46:49 np0005591285 NetworkManager[55017]: <info>  [1769042809.7318] device (tapf87e7c5b-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:46:49 np0005591285 ovn_controller[94908]: 2026-01-22T00:46:49Z|00695|binding|INFO|Releasing lport f87e7c5b-000a-44c7-a7e8-b7e97027b22d from this chassis (sb_readonly=0)
Jan 21 19:46:49 np0005591285 ovn_controller[94908]: 2026-01-22T00:46:49Z|00696|binding|INFO|Setting lport f87e7c5b-000a-44c7-a7e8-b7e97027b22d down in Southbound
Jan 21 19:46:49 np0005591285 ovn_controller[94908]: 2026-01-22T00:46:49Z|00697|binding|INFO|Removing iface tapf87e7c5b-00 ovn-installed in OVS
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.738 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.747 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:c8:f6 10.100.0.4 2001:db8::f816:3eff:fe74:c8f6'], port_security=['fa:16:3e:74:c8:f6 10.100.0.4 2001:db8::f816:3eff:fe74:c8f6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe74:c8f6/64', 'neutron:device_id': 'b53b9c71-63b9-497f-a60b-07fe6f17dad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bd69fbc7-ff38-42ce-b5d5-6559f7285ccb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=f87e7c5b-000a-44c7-a7e8-b7e97027b22d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.749 104259 INFO neutron.agent.ovn.metadata.agent [-] Port f87e7c5b-000a-44c7-a7e8-b7e97027b22d in datapath 83666af9-15ce-4344-a623-7180c9b2515a unbound from our chassis#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.751 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 83666af9-15ce-4344-a623-7180c9b2515a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.753 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[eddef0c3-9258-4a4a-9aef-7f3a7669984d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.754 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a namespace which is not needed anymore#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.769 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Jan 21 19:46:49 np0005591285 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000b7.scope: Consumed 15.539s CPU time.
Jan 21 19:46:49 np0005591285 systemd-machined[154022]: Machine qemu-78-instance-000000b7 terminated.
Jan 21 19:46:49 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [NOTICE]   (244484) : haproxy version is 2.8.14-c23fe91
Jan 21 19:46:49 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [NOTICE]   (244484) : path to executable is /usr/sbin/haproxy
Jan 21 19:46:49 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [WARNING]  (244484) : Exiting Master process...
Jan 21 19:46:49 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [ALERT]    (244484) : Current worker (244486) exited with code 143 (Terminated)
Jan 21 19:46:49 np0005591285 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[244480]: [WARNING]  (244484) : All workers exited. Exiting... (0)
Jan 21 19:46:49 np0005591285 systemd[1]: libpod-669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c.scope: Deactivated successfully.
Jan 21 19:46:49 np0005591285 podman[244813]: 2026-01-22 00:46:49.882683442 +0000 UTC m=+0.042747355 container died 669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:46:49 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c-userdata-shm.mount: Deactivated successfully.
Jan 21 19:46:49 np0005591285 systemd[1]: var-lib-containers-storage-overlay-156bbbc55167c4bf0f8269a27234d2f7a7576ba89b6a5a72995d64162a72f16c-merged.mount: Deactivated successfully.
Jan 21 19:46:49 np0005591285 podman[244813]: 2026-01-22 00:46:49.913117373 +0000 UTC m=+0.073181296 container cleanup 669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:46:49 np0005591285 systemd[1]: libpod-conmon-669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c.scope: Deactivated successfully.
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.955 182759 INFO nova.virt.libvirt.driver [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Instance destroyed successfully.#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.956 182759 DEBUG nova.objects.instance [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid b53b9c71-63b9-497f-a60b-07fe6f17dad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.973 182759 DEBUG nova.virt.libvirt.vif [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:45:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-362739796',display_name='tempest-TestGettingAddress-server-362739796',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-362739796',id=183,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3/PLQY4lAQU2yFGaoAmqWPJI5565ofTauEAmPcwEncHglgrmt+9X41pDrGx2Hzo63wjxi644i8QnD2R87vFxz3Kmnkg4MUbe27S7AT4N98N34iBfOk+UwjPX/szWkvLg==',key_name='tempest-TestGettingAddress-2046948813',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:45:46Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-hs0y34u0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:45:46Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=b53b9c71-63b9-497f-a60b-07fe6f17dad1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.973 182759 DEBUG nova.network.os_vif_util [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.974 182759 DEBUG nova.network.os_vif_util [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.974 182759 DEBUG os_vif [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.976 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.976 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf87e7c5b-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.979 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 podman[244849]: 2026-01-22 00:46:49.980974084 +0000 UTC m=+0.044597405 container remove 669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.984 182759 INFO os_vif [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:c8:f6,bridge_name='br-int',has_traffic_filtering=True,id=f87e7c5b-000a-44c7-a7e8-b7e97027b22d,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf87e7c5b-00')#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.985 182759 INFO nova.virt.libvirt.driver [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Deleting instance files /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1_del#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.986 182759 INFO nova.virt.libvirt.driver [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Deletion of /var/lib/nova/instances/b53b9c71-63b9-497f-a60b-07fe6f17dad1_del complete#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.986 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5a2b31-8fc9-47ad-8513-0604aec8fed2]: (4, ('Thu Jan 22 12:46:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a (669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c)\n669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c\nThu Jan 22 12:46:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a (669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c)\n669c8a857a523c77811ecc1037081ab50ceadf985188e4e17817d08c4011be2c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.988 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1febab2c-a5af-4240-93b4-592b3b343360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:49 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:49.990 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83666af9-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:46:49 np0005591285 nova_compute[182755]: 2026-01-22 00:46:49.991 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:49 np0005591285 kernel: tap83666af9-10: left promiscuous mode
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.003 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:50.006 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[066130d9-c2a9-4ce0-83db-8413e132166c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:50.025 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8a8aa3b4-7510-4864-84b6-3fcfd7f80f59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:50.026 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[623e4f1b-698f-416a-9ad2-ec6870171036]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:50.045 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca10adf-2a02-40c0-b93a-c64986ef887c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730417, 'reachable_time': 26288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244873, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:50.048 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:46:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:46:50.048 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee31176-2b52-4d63-9ab3-2a7b08691994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:46:50 np0005591285 systemd[1]: run-netns-ovnmeta\x2d83666af9\x2d15ce\x2d4344\x2da623\x2d7180c9b2515a.mount: Deactivated successfully.
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.082 182759 DEBUG nova.compute.manager [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-changed-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.083 182759 DEBUG nova.compute.manager [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Refreshing instance network info cache due to event network-changed-f87e7c5b-000a-44c7-a7e8-b7e97027b22d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.083 182759 DEBUG oslo_concurrency.lockutils [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.083 182759 DEBUG oslo_concurrency.lockutils [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.083 182759 DEBUG nova.network.neutron [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Refreshing network info cache for port f87e7c5b-000a-44c7-a7e8-b7e97027b22d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.088 182759 INFO nova.compute.manager [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.089 182759 DEBUG oslo.service.loopingcall [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.089 182759 DEBUG nova.compute.manager [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.089 182759 DEBUG nova.network.neutron [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.642 182759 DEBUG nova.compute.manager [req-a7714b66-fd6e-4830-8e57-1f670488b7f4 req-d8d2f435-c79f-4fa6-821d-2fdecf69b825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-vif-unplugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.643 182759 DEBUG oslo_concurrency.lockutils [req-a7714b66-fd6e-4830-8e57-1f670488b7f4 req-d8d2f435-c79f-4fa6-821d-2fdecf69b825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.643 182759 DEBUG oslo_concurrency.lockutils [req-a7714b66-fd6e-4830-8e57-1f670488b7f4 req-d8d2f435-c79f-4fa6-821d-2fdecf69b825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.643 182759 DEBUG oslo_concurrency.lockutils [req-a7714b66-fd6e-4830-8e57-1f670488b7f4 req-d8d2f435-c79f-4fa6-821d-2fdecf69b825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.643 182759 DEBUG nova.compute.manager [req-a7714b66-fd6e-4830-8e57-1f670488b7f4 req-d8d2f435-c79f-4fa6-821d-2fdecf69b825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] No waiting events found dispatching network-vif-unplugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:46:50 np0005591285 nova_compute[182755]: 2026-01-22 00:46:50.644 182759 DEBUG nova.compute.manager [req-a7714b66-fd6e-4830-8e57-1f670488b7f4 req-d8d2f435-c79f-4fa6-821d-2fdecf69b825 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-vif-unplugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.276 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.643 182759 DEBUG nova.network.neutron [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.673 182759 INFO nova.compute.manager [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Took 1.58 seconds to deallocate network for instance.#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.765 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.765 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.828 182759 DEBUG nova.compute.provider_tree [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.844 182759 DEBUG nova.scheduler.client.report [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.864 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.888 182759 INFO nova.scheduler.client.report [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance b53b9c71-63b9-497f-a60b-07fe6f17dad1#033[00m
Jan 21 19:46:51 np0005591285 nova_compute[182755]: 2026-01-22 00:46:51.956 182759 DEBUG oslo_concurrency.lockutils [None req-470ad05d-f3e2-45eb-9145-e5c2d4afcc71 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.329 182759 DEBUG nova.network.neutron [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updated VIF entry in instance network info cache for port f87e7c5b-000a-44c7-a7e8-b7e97027b22d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.329 182759 DEBUG nova.network.neutron [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Updating instance_info_cache with network_info: [{"id": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "address": "fa:16:3e:74:c8:f6", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe74:c8f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf87e7c5b-00", "ovs_interfaceid": "f87e7c5b-000a-44c7-a7e8-b7e97027b22d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.354 182759 DEBUG oslo_concurrency.lockutils [req-e2f2a2ab-7a3a-4821-b233-d4a3577a3f8c req-1826002a-5263-4f1f-9dfa-e9e1145aa4b5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b53b9c71-63b9-497f-a60b-07fe6f17dad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.765 182759 DEBUG nova.compute.manager [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.766 182759 DEBUG oslo_concurrency.lockutils [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.767 182759 DEBUG oslo_concurrency.lockutils [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.767 182759 DEBUG oslo_concurrency.lockutils [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b53b9c71-63b9-497f-a60b-07fe6f17dad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.767 182759 DEBUG nova.compute.manager [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] No waiting events found dispatching network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.767 182759 WARNING nova.compute.manager [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received unexpected event network-vif-plugged-f87e7c5b-000a-44c7-a7e8-b7e97027b22d for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:46:52 np0005591285 nova_compute[182755]: 2026-01-22 00:46:52.768 182759 DEBUG nova.compute.manager [req-4e73fed9-45f2-414d-8710-19e0a8e3a78c req-bc720672-5548-4ec9-a022-824fa9376117 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Received event network-vif-deleted-f87e7c5b-000a-44c7-a7e8-b7e97027b22d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:46:54 np0005591285 nova_compute[182755]: 2026-01-22 00:46:54.978 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:56 np0005591285 podman[244875]: 2026-01-22 00:46:56.236135019 +0000 UTC m=+0.093199605 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 19:46:56 np0005591285 podman[244876]: 2026-01-22 00:46:56.244222808 +0000 UTC m=+0.097746489 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 19:46:56 np0005591285 nova_compute[182755]: 2026-01-22 00:46:56.279 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:57 np0005591285 nova_compute[182755]: 2026-01-22 00:46:57.443 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:57 np0005591285 nova_compute[182755]: 2026-01-22 00:46:57.529 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:46:59 np0005591285 nova_compute[182755]: 2026-01-22 00:46:59.980 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:01 np0005591285 nova_compute[182755]: 2026-01-22 00:47:01.280 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:03.016 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:03.017 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:03.017 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:04 np0005591285 nova_compute[182755]: 2026-01-22 00:47:04.954 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042809.9536614, b53b9c71-63b9-497f-a60b-07fe6f17dad1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:47:04 np0005591285 nova_compute[182755]: 2026-01-22 00:47:04.955 182759 INFO nova.compute.manager [-] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:47:04 np0005591285 nova_compute[182755]: 2026-01-22 00:47:04.981 182759 DEBUG nova.compute.manager [None req-edf40b45-890d-4407-85f4-99db072218a0 - - - - - -] [instance: b53b9c71-63b9-497f-a60b-07fe6f17dad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:47:04 np0005591285 nova_compute[182755]: 2026-01-22 00:47:04.983 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:05 np0005591285 podman[244916]: 2026-01-22 00:47:05.208854383 +0000 UTC m=+0.082107298 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:47:06 np0005591285 nova_compute[182755]: 2026-01-22 00:47:06.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:08 np0005591285 podman[244941]: 2026-01-22 00:47:08.189657714 +0000 UTC m=+0.061892931 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:47:08 np0005591285 podman[244942]: 2026-01-22 00:47:08.234773102 +0000 UTC m=+0.094999605 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:47:09 np0005591285 nova_compute[182755]: 2026-01-22 00:47:09.985 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:10 np0005591285 nova_compute[182755]: 2026-01-22 00:47:10.869 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:10.871 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:47:10 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:10.872 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:47:11 np0005591285 podman[244984]: 2026-01-22 00:47:11.219735565 +0000 UTC m=+0.097135721 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 19:47:11 np0005591285 nova_compute[182755]: 2026-01-22 00:47:11.283 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:14 np0005591285 nova_compute[182755]: 2026-01-22 00:47:14.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:14 np0005591285 nova_compute[182755]: 2026-01-22 00:47:14.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:14 np0005591285 nova_compute[182755]: 2026-01-22 00:47:14.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:47:14 np0005591285 nova_compute[182755]: 2026-01-22 00:47:14.988 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:15 np0005591285 nova_compute[182755]: 2026-01-22 00:47:15.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:15 np0005591285 nova_compute[182755]: 2026-01-22 00:47:15.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:47:15 np0005591285 nova_compute[182755]: 2026-01-22 00:47:15.220 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:47:15 np0005591285 nova_compute[182755]: 2026-01-22 00:47:15.238 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:47:16 np0005591285 nova_compute[182755]: 2026-01-22 00:47:16.285 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:16 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:16.874 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:18 np0005591285 nova_compute[182755]: 2026-01-22 00:47:18.232 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:19 np0005591285 nova_compute[182755]: 2026-01-22 00:47:19.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:19 np0005591285 nova_compute[182755]: 2026-01-22 00:47:19.990 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:21 np0005591285 nova_compute[182755]: 2026-01-22 00:47:21.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:21 np0005591285 nova_compute[182755]: 2026-01-22 00:47:21.287 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:24 np0005591285 nova_compute[182755]: 2026-01-22 00:47:24.992 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.245 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.337 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:26 np0005591285 podman[245011]: 2026-01-22 00:47:26.397962965 +0000 UTC m=+0.090918935 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal)
Jan 21 19:47:26 np0005591285 podman[245012]: 2026-01-22 00:47:26.458129648 +0000 UTC m=+0.096792993 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.504 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.505 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5740MB free_disk=73.17695236206055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.506 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.506 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.569 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.569 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.599 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.613 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.635 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:47:26 np0005591285 nova_compute[182755]: 2026-01-22 00:47:26.635 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:27 np0005591285 nova_compute[182755]: 2026-01-22 00:47:27.635 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.303 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.304 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.321 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.428 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.429 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.437 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.438 182759 INFO nova.compute.claims [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.598 182759 DEBUG nova.compute.provider_tree [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.622 182759 DEBUG nova.scheduler.client.report [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.647 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.648 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.725 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.725 182759 DEBUG nova.network.neutron [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.749 182759 INFO nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.769 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.887 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.889 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.889 182759 INFO nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Creating image(s)#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.891 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "/var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.891 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "/var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.892 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "/var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.923 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:47:29 np0005591285 nova_compute[182755]: 2026-01-22 00:47:29.995 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.020 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.021 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.021 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.032 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.086 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.087 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.121 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.123 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.124 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.179 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.181 182759 DEBUG nova.virt.disk.api [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Checking if we can resize image /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.181 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.237 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.238 182759 DEBUG nova.virt.disk.api [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Cannot resize image /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.239 182759 DEBUG nova.objects.instance [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'migration_context' on Instance uuid b0c229ff-8141-43f5-a553-f5282618869e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.255 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.256 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Ensure instance console log exists: /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.256 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.256 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.257 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:30 np0005591285 nova_compute[182755]: 2026-01-22 00:47:30.471 182759 DEBUG nova.network.neutron [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Successfully created port: 84562d91-1d45-4705-a924-4d9ca2b2ab5f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.235 182759 DEBUG nova.network.neutron [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Successfully updated port: 84562d91-1d45-4705-a924-4d9ca2b2ab5f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.265 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "refresh_cache-b0c229ff-8141-43f5-a553-f5282618869e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.265 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquired lock "refresh_cache-b0c229ff-8141-43f5-a553-f5282618869e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.266 182759 DEBUG nova.network.neutron [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.339 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.344 182759 DEBUG nova.compute.manager [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-changed-84562d91-1d45-4705-a924-4d9ca2b2ab5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.345 182759 DEBUG nova.compute.manager [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Refreshing instance network info cache due to event network-changed-84562d91-1d45-4705-a924-4d9ca2b2ab5f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:47:31 np0005591285 nova_compute[182755]: 2026-01-22 00:47:31.345 182759 DEBUG oslo_concurrency.lockutils [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b0c229ff-8141-43f5-a553-f5282618869e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:47:32 np0005591285 nova_compute[182755]: 2026-01-22 00:47:32.400 182759 DEBUG nova.network.neutron [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.302 182759 DEBUG nova.network.neutron [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Updating instance_info_cache with network_info: [{"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.333 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Releasing lock "refresh_cache-b0c229ff-8141-43f5-a553-f5282618869e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.333 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Instance network_info: |[{"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.333 182759 DEBUG oslo_concurrency.lockutils [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b0c229ff-8141-43f5-a553-f5282618869e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.333 182759 DEBUG nova.network.neutron [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Refreshing network info cache for port 84562d91-1d45-4705-a924-4d9ca2b2ab5f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.336 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Start _get_guest_xml network_info=[{"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.341 182759 WARNING nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.345 182759 DEBUG nova.virt.libvirt.host [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.346 182759 DEBUG nova.virt.libvirt.host [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.349 182759 DEBUG nova.virt.libvirt.host [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.349 182759 DEBUG nova.virt.libvirt.host [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.351 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.351 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.352 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.352 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.352 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.352 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.353 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.353 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.353 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.353 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.354 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.354 182759 DEBUG nova.virt.hardware [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.359 182759 DEBUG nova.virt.libvirt.vif [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:47:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-659438332',display_name='tempest-TestServerMultinode-server-659438332',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-659438332',id=186,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-f1x0xb7e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-3858
46676-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:47:29Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=b0c229ff-8141-43f5-a553-f5282618869e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.359 182759 DEBUG nova.network.os_vif_util [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.360 182759 DEBUG nova.network.os_vif_util [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.361 182759 DEBUG nova.objects.instance [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid b0c229ff-8141-43f5-a553-f5282618869e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.376 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <uuid>b0c229ff-8141-43f5-a553-f5282618869e</uuid>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <name>instance-000000ba</name>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestServerMultinode-server-659438332</nova:name>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:47:33</nova:creationTime>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:user uuid="8fb6fa8c5dd241fb975d0e13ddb107f4">tempest-TestServerMultinode-385846676-project-admin</nova:user>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:project uuid="38ae0051f15c46809f70ec5299cfb2c6">tempest-TestServerMultinode-385846676</nova:project>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        <nova:port uuid="84562d91-1d45-4705-a924-4d9ca2b2ab5f">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <entry name="serial">b0c229ff-8141-43f5-a553-f5282618869e</entry>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <entry name="uuid">b0c229ff-8141-43f5-a553-f5282618869e</entry>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.config"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:f6:86:fa"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <target dev="tap84562d91-1d"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/console.log" append="off"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:47:33 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:47:33 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:47:33 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:47:33 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.377 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Preparing to wait for external event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.377 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.378 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.378 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.379 182759 DEBUG nova.virt.libvirt.vif [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:47:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-659438332',display_name='tempest-TestServerMultinode-server-659438332',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-659438332',id=186,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-f1x0xb7e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMult
inode-385846676-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:47:29Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=b0c229ff-8141-43f5-a553-f5282618869e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.379 182759 DEBUG nova.network.os_vif_util [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.380 182759 DEBUG nova.network.os_vif_util [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.381 182759 DEBUG os_vif [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.381 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.382 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.382 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.387 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap84562d91-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.387 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap84562d91-1d, col_values=(('external_ids', {'iface-id': '84562d91-1d45-4705-a924-4d9ca2b2ab5f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:86:fa', 'vm-uuid': 'b0c229ff-8141-43f5-a553-f5282618869e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.389 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:33 np0005591285 NetworkManager[55017]: <info>  [1769042853.3903] manager: (tap84562d91-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.392 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.397 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.398 182759 INFO os_vif [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d')#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.467 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.468 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.468 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No VIF found with MAC fa:16:3e:f6:86:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:47:33 np0005591285 nova_compute[182755]: 2026-01-22 00:47:33.468 182759 INFO nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Using config drive#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.145 182759 INFO nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Creating config drive at /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.config#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.150 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8w4u_hcw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.293 182759 DEBUG oslo_concurrency.processutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8w4u_hcw" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:47:34 np0005591285 kernel: tap84562d91-1d: entered promiscuous mode
Jan 21 19:47:34 np0005591285 NetworkManager[55017]: <info>  [1769042854.3828] manager: (tap84562d91-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/347)
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:34Z|00698|binding|INFO|Claiming lport 84562d91-1d45-4705-a924-4d9ca2b2ab5f for this chassis.
Jan 21 19:47:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:34Z|00699|binding|INFO|84562d91-1d45-4705-a924-4d9ca2b2ab5f: Claiming fa:16:3e:f6:86:fa 10.100.0.12
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.414 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.426 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:86:fa 10.100.0.12'], port_security=['fa:16:3e:f6:86:fa 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b0c229ff-8141-43f5-a553-f5282618869e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=84562d91-1d45-4705-a924-4d9ca2b2ab5f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.428 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 84562d91-1d45-4705-a924-4d9ca2b2ab5f in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 bound to our chassis#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.430 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.447 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[9169ea25-dec4-4469-a70b-b3314b4a3616]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.448 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc27f16e8-e1 in ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:47:34 np0005591285 systemd-machined[154022]: New machine qemu-79-instance-000000ba.
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.451 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc27f16e8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.451 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc6418b-7739-47c9-a5ba-004686b5fde9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.452 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[16588ed8-3908-48fb-8a99-f0180d4358b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.470 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[a9eecaaf-cc9e-45fa-9ed8-fe702e96f8f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:34Z|00700|binding|INFO|Setting lport 84562d91-1d45-4705-a924-4d9ca2b2ab5f ovn-installed in OVS
Jan 21 19:47:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:34Z|00701|binding|INFO|Setting lport 84562d91-1d45-4705-a924-4d9ca2b2ab5f up in Southbound
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.483 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 systemd[1]: Started Virtual Machine qemu-79-instance-000000ba.
Jan 21 19:47:34 np0005591285 systemd-udevd[245090]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.499 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[13e017df-0223-429a-8061-353a6a0407cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 NetworkManager[55017]: <info>  [1769042854.5150] device (tap84562d91-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:47:34 np0005591285 NetworkManager[55017]: <info>  [1769042854.5156] device (tap84562d91-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.536 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[ef51670f-fa0a-4795-b462-8402b5836756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.541 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[057e6376-a662-4141-a693-a61dc07bc8c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 NetworkManager[55017]: <info>  [1769042854.5425] manager: (tapc27f16e8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/348)
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.570 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[07fa6b46-759c-474b-a6d4-47b5232702e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.574 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[05ea9bd2-fc55-449a-8ba0-b8083833651e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 NetworkManager[55017]: <info>  [1769042854.6034] device (tapc27f16e8-e0): carrier: link connected
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.610 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf10886-b951-4833-b48d-71a83252b35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.633 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc637fb-193b-4b65-85b9-8091a16e6f0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc27f16e8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:14:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 741326, 'reachable_time': 19318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245120, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.651 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[0eecddb6-1537-4cf8-b7b0-5b874a36523b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:144f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 741326, 'tstamp': 741326}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245121, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.676 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[b07a9cb4-8764-4fe7-b0c0-35085133ca66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc27f16e8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:14:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 741326, 'reachable_time': 19318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245122, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.724 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[22ecd7a6-eb95-43ab-acf3-90ba43902508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.797 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ba492ab2-9f59-4655-97f8-ba3bca083721]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.798 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f16e8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.799 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.799 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc27f16e8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.801 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 NetworkManager[55017]: <info>  [1769042854.8041] manager: (tapc27f16e8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Jan 21 19:47:34 np0005591285 kernel: tapc27f16e8-e0: entered promiscuous mode
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.805 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.808 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc27f16e8-e0, col_values=(('external_ids', {'iface-id': '8c4d0320-cbc0-4761-8fbc-cd4251890b14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:34Z|00702|binding|INFO|Releasing lport 8c4d0320-cbc0-4761-8fbc-cd4251890b14 from this chassis (sb_readonly=0)
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.824 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.825 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.826 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6451548a-1b69-47cf-ac25-960e6baa5b7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.827 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:47:34 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:34.827 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'env', 'PROCESS_TAG=haproxy-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.827 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.955 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042854.954306, b0c229ff-8141-43f5-a553-f5282618869e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.955 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] VM Started (Lifecycle Event)#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.984 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.989 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042854.9545527, b0c229ff-8141-43f5-a553-f5282618869e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:47:34 np0005591285 nova_compute[182755]: 2026-01-22 00:47:34.989 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.015 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.018 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.045 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.087 182759 DEBUG nova.network.neutron [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Updated VIF entry in instance network info cache for port 84562d91-1d45-4705-a924-4d9ca2b2ab5f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.088 182759 DEBUG nova.network.neutron [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Updating instance_info_cache with network_info: [{"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.121 182759 DEBUG oslo_concurrency.lockutils [req-3ef63251-8271-43e3-b3a0-21c73810073d req-df44e56d-a58e-4e75-a64c-b2209c2e7e9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b0c229ff-8141-43f5-a553-f5282618869e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:47:35 np0005591285 podman[245159]: 2026-01-22 00:47:35.258282505 +0000 UTC m=+0.054535352 container create 467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 21 19:47:35 np0005591285 systemd[1]: Started libpod-conmon-467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3.scope.
Jan 21 19:47:35 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:47:35 np0005591285 podman[245159]: 2026-01-22 00:47:35.229257842 +0000 UTC m=+0.025510689 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:47:35 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a45f8dc95a3aa6f3ee5201372ee8e3fc8b78d5c7c00d859b397a12b1d1653f08/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:47:35 np0005591285 podman[245159]: 2026-01-22 00:47:35.350470213 +0000 UTC m=+0.146723100 container init 467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 19:47:35 np0005591285 podman[245159]: 2026-01-22 00:47:35.358930151 +0000 UTC m=+0.155182998 container start 467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 19:47:35 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [NOTICE]   (245195) : New worker (245204) forked
Jan 21 19:47:35 np0005591285 podman[245172]: 2026-01-22 00:47:35.387322297 +0000 UTC m=+0.075038975 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:47:35 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [NOTICE]   (245195) : Loading success.
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.561 182759 DEBUG nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.562 182759 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.563 182759 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.564 182759 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.564 182759 DEBUG nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Processing event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.565 182759 DEBUG nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.565 182759 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.566 182759 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.566 182759 DEBUG oslo_concurrency.lockutils [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.566 182759 DEBUG nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] No waiting events found dispatching network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.567 182759 WARNING nova.compute.manager [req-f6edc751-4b0a-4d45-8159-bb75db2a1269 req-683f0971-da3d-46c9-b345-da1e2bc5755e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received unexpected event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f for instance with vm_state building and task_state spawning.#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.568 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.573 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042855.5730324, b0c229ff-8141-43f5-a553-f5282618869e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.573 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.575 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.580 182759 INFO nova.virt.libvirt.driver [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Instance spawned successfully.#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.581 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.609 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.616 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.621 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.621 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.622 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.622 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.623 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.623 182759 DEBUG nova.virt.libvirt.driver [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.653 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.731 182759 INFO nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Took 5.84 seconds to spawn the instance on the hypervisor.#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.732 182759 DEBUG nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.831 182759 INFO nova.compute.manager [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Took 6.45 seconds to build instance.#033[00m
Jan 21 19:47:35 np0005591285 nova_compute[182755]: 2026-01-22 00:47:35.849 182759 DEBUG oslo_concurrency.lockutils [None req-24510a19-17e8-492d-aed4-b2e33151d95c 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:36 np0005591285 nova_compute[182755]: 2026-01-22 00:47:36.344 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:38 np0005591285 nova_compute[182755]: 2026-01-22 00:47:38.391 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:39 np0005591285 podman[245216]: 2026-01-22 00:47:39.19225982 +0000 UTC m=+0.057088732 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:47:39 np0005591285 podman[245215]: 2026-01-22 00:47:39.193971826 +0000 UTC m=+0.067846672 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:47:41 np0005591285 nova_compute[182755]: 2026-01-22 00:47:41.347 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:42 np0005591285 podman[245258]: 2026-01-22 00:47:42.25374243 +0000 UTC m=+0.113390642 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 21 19:47:43 np0005591285 nova_compute[182755]: 2026-01-22 00:47:43.393 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:46 np0005591285 nova_compute[182755]: 2026-01-22 00:47:46.348 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:47Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:86:fa 10.100.0.12
Jan 21 19:47:47 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:47Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:86:fa 10.100.0.12
Jan 21 19:47:48 np0005591285 nova_compute[182755]: 2026-01-22 00:47:48.396 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.849 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.850 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.851 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.852 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.852 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.877 182759 INFO nova.compute.manager [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Terminating instance#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.892 182759 DEBUG nova.compute.manager [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:47:50 np0005591285 kernel: tap84562d91-1d (unregistering): left promiscuous mode
Jan 21 19:47:50 np0005591285 NetworkManager[55017]: <info>  [1769042870.9229] device (tap84562d91-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.959 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:50Z|00703|binding|INFO|Releasing lport 84562d91-1d45-4705-a924-4d9ca2b2ab5f from this chassis (sb_readonly=0)
Jan 21 19:47:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:50Z|00704|binding|INFO|Setting lport 84562d91-1d45-4705-a924-4d9ca2b2ab5f down in Southbound
Jan 21 19:47:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:47:50Z|00705|binding|INFO|Removing iface tap84562d91-1d ovn-installed in OVS
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.963 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:50.975 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:86:fa 10.100.0.12'], port_security=['fa:16:3e:f6:86:fa 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b0c229ff-8141-43f5-a553-f5282618869e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=84562d91-1d45-4705-a924-4d9ca2b2ab5f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:47:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:50.977 104259 INFO neutron.agent.ovn.metadata.agent [-] Port 84562d91-1d45-4705-a924-4d9ca2b2ab5f in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 unbound from our chassis#033[00m
Jan 21 19:47:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:50.978 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:47:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:50.980 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[75f1f334-f8c5-4e8b-afaa-8faeb1a3a114]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:50 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:50.981 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 namespace which is not needed anymore#033[00m
Jan 21 19:47:50 np0005591285 nova_compute[182755]: 2026-01-22 00:47:50.984 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Jan 21 19:47:51 np0005591285 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000ba.scope: Consumed 12.491s CPU time.
Jan 21 19:47:51 np0005591285 systemd-machined[154022]: Machine qemu-79-instance-000000ba terminated.
Jan 21 19:47:51 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [NOTICE]   (245195) : haproxy version is 2.8.14-c23fe91
Jan 21 19:47:51 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [NOTICE]   (245195) : path to executable is /usr/sbin/haproxy
Jan 21 19:47:51 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [WARNING]  (245195) : Exiting Master process...
Jan 21 19:47:51 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [ALERT]    (245195) : Current worker (245204) exited with code 143 (Terminated)
Jan 21 19:47:51 np0005591285 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[245175]: [WARNING]  (245195) : All workers exited. Exiting... (0)
Jan 21 19:47:51 np0005591285 systemd[1]: libpod-467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3.scope: Deactivated successfully.
Jan 21 19:47:51 np0005591285 podman[245326]: 2026-01-22 00:47:51.128693283 +0000 UTC m=+0.051530952 container died 467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:47:51 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3-userdata-shm.mount: Deactivated successfully.
Jan 21 19:47:51 np0005591285 systemd[1]: var-lib-containers-storage-overlay-a45f8dc95a3aa6f3ee5201372ee8e3fc8b78d5c7c00d859b397a12b1d1653f08-merged.mount: Deactivated successfully.
Jan 21 19:47:51 np0005591285 podman[245326]: 2026-01-22 00:47:51.179226057 +0000 UTC m=+0.102063746 container cleanup 467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.178 182759 INFO nova.virt.libvirt.driver [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Instance destroyed successfully.#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.180 182759 DEBUG nova.objects.instance [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'resources' on Instance uuid b0c229ff-8141-43f5-a553-f5282618869e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:47:51 np0005591285 systemd[1]: libpod-conmon-467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3.scope: Deactivated successfully.
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.196 182759 DEBUG nova.virt.libvirt.vif [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:47:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-659438332',display_name='tempest-TestServerMultinode-server-659438332',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-659438332',id=186,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:47:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-f1x0xb7e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-385846676-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:47:35Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=b0c229ff-8141-43f5-a553-f5282618869e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.197 182759 DEBUG nova.network.os_vif_util [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "address": "fa:16:3e:f6:86:fa", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap84562d91-1d", "ovs_interfaceid": "84562d91-1d45-4705-a924-4d9ca2b2ab5f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.198 182759 DEBUG nova.network.os_vif_util [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.198 182759 DEBUG os_vif [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.200 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.201 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap84562d91-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.202 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.204 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.206 182759 INFO os_vif [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:86:fa,bridge_name='br-int',has_traffic_filtering=True,id=84562d91-1d45-4705-a924-4d9ca2b2ab5f,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap84562d91-1d')#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.207 182759 INFO nova.virt.libvirt.driver [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Deleting instance files /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e_del#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.207 182759 INFO nova.virt.libvirt.driver [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Deletion of /var/lib/nova/instances/b0c229ff-8141-43f5-a553-f5282618869e_del complete#033[00m
Jan 21 19:47:51 np0005591285 podman[245372]: 2026-01-22 00:47:51.2538192 +0000 UTC m=+0.046791073 container remove 467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.259 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0c9b0f-3829-4f4d-aa52-be7d57d334ad]: (4, ('Thu Jan 22 12:47:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 (467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3)\n467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3\nThu Jan 22 12:47:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 (467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3)\n467708e6f2a8f0512d9edb8822894fd1fc481abc7e5769dfc7ea6309da11d4d3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.261 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[a192475f-cc18-4618-bc79-6cb8828e129b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.262 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f16e8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:47:51 np0005591285 kernel: tapc27f16e8-e0: left promiscuous mode
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.265 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.276 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.278 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[24bd84e9-fa3f-42f6-b4ad-897a1f77b04b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.293 182759 INFO nova.compute.manager [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.293 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8b40a415-5d66-409b-8c78-94cfb3bfd444]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.294 182759 DEBUG oslo.service.loopingcall [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.294 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd052fa-41f9-4c89-ac87-84ba1332ed7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.294 182759 DEBUG nova.compute.manager [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.295 182759 DEBUG nova.network.neutron [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.313 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4aa4e7-9da0-4689-8c7a-c3b89371521c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 741318, 'reachable_time': 28309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245387, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.317 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 21 19:47:51 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:47:51.317 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb30ad6-cf19-4819-8a9f-621fdb928cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:47:51 np0005591285 systemd[1]: run-netns-ovnmeta\x2dc27f16e8\x2de7ea\x2d4ce6\x2d8fc8\x2d52a4d97170f2.mount: Deactivated successfully.
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.348 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.715 182759 DEBUG nova.compute.manager [req-c597601f-bf4f-438c-a1fa-22052a48a42b req-7f61aadc-74a6-4a5e-a20d-a5d16c6cf79e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-vif-unplugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.717 182759 DEBUG oslo_concurrency.lockutils [req-c597601f-bf4f-438c-a1fa-22052a48a42b req-7f61aadc-74a6-4a5e-a20d-a5d16c6cf79e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.717 182759 DEBUG oslo_concurrency.lockutils [req-c597601f-bf4f-438c-a1fa-22052a48a42b req-7f61aadc-74a6-4a5e-a20d-a5d16c6cf79e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.718 182759 DEBUG oslo_concurrency.lockutils [req-c597601f-bf4f-438c-a1fa-22052a48a42b req-7f61aadc-74a6-4a5e-a20d-a5d16c6cf79e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.718 182759 DEBUG nova.compute.manager [req-c597601f-bf4f-438c-a1fa-22052a48a42b req-7f61aadc-74a6-4a5e-a20d-a5d16c6cf79e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] No waiting events found dispatching network-vif-unplugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:47:51 np0005591285 nova_compute[182755]: 2026-01-22 00:47:51.719 182759 DEBUG nova.compute.manager [req-c597601f-bf4f-438c-a1fa-22052a48a42b req-7f61aadc-74a6-4a5e-a20d-a5d16c6cf79e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-vif-unplugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.426 182759 DEBUG nova.network.neutron [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.448 182759 INFO nova.compute.manager [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Took 1.15 seconds to deallocate network for instance.#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.545 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.545 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.565 182759 DEBUG nova.compute.manager [req-a35e7b0e-0079-423c-b26d-3a2f690d6a53 req-a2345cd1-d926-423b-b8a5-eebf9765d93f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-vif-deleted-84562d91-1d45-4705-a924-4d9ca2b2ab5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.644 182759 DEBUG nova.compute.provider_tree [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.666 182759 DEBUG nova.scheduler.client.report [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.709 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.760 182759 INFO nova.scheduler.client.report [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Deleted allocations for instance b0c229ff-8141-43f5-a553-f5282618869e#033[00m
Jan 21 19:47:52 np0005591285 nova_compute[182755]: 2026-01-22 00:47:52.914 182759 DEBUG oslo_concurrency.lockutils [None req-60cffd48-e6e7-4e92-8247-b2b458e4c9d4 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:54 np0005591285 nova_compute[182755]: 2026-01-22 00:47:54.655 182759 DEBUG nova.compute.manager [req-1051b412-f232-4716-b2ed-e28a808caf59 req-a940a1e3-66cc-4243-abb7-aefc6acdd058 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:47:54 np0005591285 nova_compute[182755]: 2026-01-22 00:47:54.656 182759 DEBUG oslo_concurrency.lockutils [req-1051b412-f232-4716-b2ed-e28a808caf59 req-a940a1e3-66cc-4243-abb7-aefc6acdd058 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b0c229ff-8141-43f5-a553-f5282618869e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:47:54 np0005591285 nova_compute[182755]: 2026-01-22 00:47:54.656 182759 DEBUG oslo_concurrency.lockutils [req-1051b412-f232-4716-b2ed-e28a808caf59 req-a940a1e3-66cc-4243-abb7-aefc6acdd058 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:47:54 np0005591285 nova_compute[182755]: 2026-01-22 00:47:54.656 182759 DEBUG oslo_concurrency.lockutils [req-1051b412-f232-4716-b2ed-e28a808caf59 req-a940a1e3-66cc-4243-abb7-aefc6acdd058 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b0c229ff-8141-43f5-a553-f5282618869e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:47:54 np0005591285 nova_compute[182755]: 2026-01-22 00:47:54.657 182759 DEBUG nova.compute.manager [req-1051b412-f232-4716-b2ed-e28a808caf59 req-a940a1e3-66cc-4243-abb7-aefc6acdd058 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] No waiting events found dispatching network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:47:54 np0005591285 nova_compute[182755]: 2026-01-22 00:47:54.657 182759 WARNING nova.compute.manager [req-1051b412-f232-4716-b2ed-e28a808caf59 req-a940a1e3-66cc-4243-abb7-aefc6acdd058 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Received unexpected event network-vif-plugged-84562d91-1d45-4705-a924-4d9ca2b2ab5f for instance with vm_state deleted and task_state None.#033[00m
Jan 21 19:47:56 np0005591285 nova_compute[182755]: 2026-01-22 00:47:56.204 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:56 np0005591285 nova_compute[182755]: 2026-01-22 00:47:56.351 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:47:57 np0005591285 podman[245391]: 2026-01-22 00:47:57.199200226 +0000 UTC m=+0.071042309 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': 
['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350)
Jan 21 19:47:57 np0005591285 podman[245392]: 2026-01-22 00:47:57.201268881 +0000 UTC m=+0.071310125 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:48:01 np0005591285 nova_compute[182755]: 2026-01-22 00:48:01.206 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:01 np0005591285 nova_compute[182755]: 2026-01-22 00:48:01.353 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:48:02.388 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:48:02 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:48:02.389 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:48:02 np0005591285 nova_compute[182755]: 2026-01-22 00:48:02.390 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:48:03.018 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:48:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:48:03.018 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:48:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:48:03.018 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:48:06 np0005591285 nova_compute[182755]: 2026-01-22 00:48:06.014 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:06 np0005591285 nova_compute[182755]: 2026-01-22 00:48:06.176 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042871.1749394, b0c229ff-8141-43f5-a553-f5282618869e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:48:06 np0005591285 nova_compute[182755]: 2026-01-22 00:48:06.176 182759 INFO nova.compute.manager [-] [instance: b0c229ff-8141-43f5-a553-f5282618869e] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:48:06 np0005591285 podman[245434]: 2026-01-22 00:48:06.183942313 +0000 UTC m=+0.054133852 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:48:06 np0005591285 nova_compute[182755]: 2026-01-22 00:48:06.204 182759 DEBUG nova.compute.manager [None req-e7d43c7d-c00b-4ad2-89a1-84822a6eaf85 - - - - - -] [instance: b0c229ff-8141-43f5-a553-f5282618869e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:48:06 np0005591285 nova_compute[182755]: 2026-01-22 00:48:06.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:06 np0005591285 nova_compute[182755]: 2026-01-22 00:48:06.354 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:07 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:48:07.391 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:48:10 np0005591285 podman[245459]: 2026-01-22 00:48:10.207125605 +0000 UTC m=+0.061879141 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:48:10 np0005591285 podman[245458]: 2026-01-22 00:48:10.228727768 +0000 UTC m=+0.080503393 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible)
Jan 21 19:48:11 np0005591285 nova_compute[182755]: 2026-01-22 00:48:11.210 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:11 np0005591285 nova_compute[182755]: 2026-01-22 00:48:11.356 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:13 np0005591285 podman[245499]: 2026-01-22 00:48:13.27770458 +0000 UTC m=+0.142191648 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 19:48:15 np0005591285 nova_compute[182755]: 2026-01-22 00:48:15.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:15 np0005591285 nova_compute[182755]: 2026-01-22 00:48:15.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.212 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.235 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.236 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:16 np0005591285 nova_compute[182755]: 2026-01-22 00:48:16.359 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:18 np0005591285 nova_compute[182755]: 2026-01-22 00:48:18.231 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:20 np0005591285 nova_compute[182755]: 2026-01-22 00:48:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:21 np0005591285 nova_compute[182755]: 2026-01-22 00:48:21.215 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:21 np0005591285 nova_compute[182755]: 2026-01-22 00:48:21.360 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:22 np0005591285 nova_compute[182755]: 2026-01-22 00:48:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.182 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.183 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:48:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:48:26 np0005591285 nova_compute[182755]: 2026-01-22 00:48:26.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:26 np0005591285 nova_compute[182755]: 2026-01-22 00:48:26.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:48:26 np0005591285 nova_compute[182755]: 2026-01-22 00:48:26.221 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:26 np0005591285 nova_compute[182755]: 2026-01-22 00:48:26.263 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:48:26 np0005591285 nova_compute[182755]: 2026-01-22 00:48:26.361 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.263 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.264 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.295 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.295 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.296 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.296 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:48:27 np0005591285 podman[245525]: 2026-01-22 00:48:27.422501989 +0000 UTC m=+0.079199898 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Jan 21 19:48:27 np0005591285 podman[245527]: 2026-01-22 00:48:27.434328168 +0000 UTC m=+0.073574456 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.505 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.506 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5739MB free_disk=73.17700958251953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.507 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.507 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.641 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.642 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.664 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.708 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.753 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:48:27 np0005591285 nova_compute[182755]: 2026-01-22 00:48:27.753 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:48:31 np0005591285 nova_compute[182755]: 2026-01-22 00:48:31.224 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:31 np0005591285 nova_compute[182755]: 2026-01-22 00:48:31.362 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:31 np0005591285 nova_compute[182755]: 2026-01-22 00:48:31.708 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:36 np0005591285 nova_compute[182755]: 2026-01-22 00:48:36.227 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:36 np0005591285 nova_compute[182755]: 2026-01-22 00:48:36.363 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:36 np0005591285 podman[245566]: 2026-01-22 00:48:36.682201288 +0000 UTC m=+0.055354055 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:48:38 np0005591285 nova_compute[182755]: 2026-01-22 00:48:38.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:41 np0005591285 podman[245592]: 2026-01-22 00:48:41.1700431 +0000 UTC m=+0.046887797 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:48:41 np0005591285 podman[245593]: 2026-01-22 00:48:41.177945052 +0000 UTC m=+0.050464673 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:48:41 np0005591285 nova_compute[182755]: 2026-01-22 00:48:41.230 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:41 np0005591285 ovn_controller[94908]: 2026-01-22T00:48:41Z|00706|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 19:48:41 np0005591285 nova_compute[182755]: 2026-01-22 00:48:41.365 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:44 np0005591285 podman[245633]: 2026-01-22 00:48:44.205555097 +0000 UTC m=+0.073744801 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:48:46 np0005591285 nova_compute[182755]: 2026-01-22 00:48:46.233 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:46 np0005591285 nova_compute[182755]: 2026-01-22 00:48:46.366 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:51 np0005591285 nova_compute[182755]: 2026-01-22 00:48:51.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:51 np0005591285 nova_compute[182755]: 2026-01-22 00:48:51.367 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:53 np0005591285 nova_compute[182755]: 2026-01-22 00:48:53.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:48:53 np0005591285 nova_compute[182755]: 2026-01-22 00:48:53.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:48:56 np0005591285 nova_compute[182755]: 2026-01-22 00:48:56.239 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:56 np0005591285 nova_compute[182755]: 2026-01-22 00:48:56.371 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:48:58 np0005591285 podman[245660]: 2026-01-22 00:48:58.208114529 +0000 UTC m=+0.072148207 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 19:48:58 np0005591285 podman[245659]: 2026-01-22 00:48:58.231795998 +0000 UTC m=+0.098868678 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41)
Jan 21 19:49:01 np0005591285 nova_compute[182755]: 2026-01-22 00:49:01.242 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:01 np0005591285 nova_compute[182755]: 2026-01-22 00:49:01.372 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:02 np0005591285 nova_compute[182755]: 2026-01-22 00:49:02.700 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:03.019 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:03.020 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:03.020 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:06 np0005591285 nova_compute[182755]: 2026-01-22 00:49:06.245 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:06 np0005591285 nova_compute[182755]: 2026-01-22 00:49:06.373 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:07 np0005591285 podman[245697]: 2026-01-22 00:49:07.172834876 +0000 UTC m=+0.049609150 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:49:11 np0005591285 nova_compute[182755]: 2026-01-22 00:49:11.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:11 np0005591285 nova_compute[182755]: 2026-01-22 00:49:11.248 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:11 np0005591285 nova_compute[182755]: 2026-01-22 00:49:11.376 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:12 np0005591285 podman[245723]: 2026-01-22 00:49:12.182688765 +0000 UTC m=+0.051550812 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:49:12 np0005591285 podman[245722]: 2026-01-22 00:49:12.194101293 +0000 UTC m=+0.062336663 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 19:49:15 np0005591285 podman[245768]: 2026-01-22 00:49:15.259998811 +0000 UTC m=+0.124433770 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 21 19:49:16 np0005591285 nova_compute[182755]: 2026-01-22 00:49:16.239 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:16 np0005591285 nova_compute[182755]: 2026-01-22 00:49:16.250 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:16 np0005591285 nova_compute[182755]: 2026-01-22 00:49:16.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:17 np0005591285 nova_compute[182755]: 2026-01-22 00:49:17.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:17 np0005591285 nova_compute[182755]: 2026-01-22 00:49:17.216 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:49:18 np0005591285 nova_compute[182755]: 2026-01-22 00:49:18.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:18 np0005591285 nova_compute[182755]: 2026-01-22 00:49:18.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:18 np0005591285 nova_compute[182755]: 2026-01-22 00:49:18.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:49:18 np0005591285 nova_compute[182755]: 2026-01-22 00:49:18.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:49:18 np0005591285 nova_compute[182755]: 2026-01-22 00:49:18.246 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:49:21 np0005591285 nova_compute[182755]: 2026-01-22 00:49:21.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:21 np0005591285 nova_compute[182755]: 2026-01-22 00:49:21.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:21 np0005591285 nova_compute[182755]: 2026-01-22 00:49:21.379 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:22 np0005591285 nova_compute[182755]: 2026-01-22 00:49:22.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:26 np0005591285 nova_compute[182755]: 2026-01-22 00:49:26.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:26 np0005591285 nova_compute[182755]: 2026-01-22 00:49:26.381 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.252 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.254 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.454 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.455 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5739MB free_disk=73.17667388916016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.456 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.456 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.531 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.532 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.556 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.581 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.584 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:49:27 np0005591285 nova_compute[182755]: 2026-01-22 00:49:27.584 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:29 np0005591285 podman[245797]: 2026-01-22 00:49:29.216780567 +0000 UTC m=+0.079353793 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 21 19:49:29 np0005591285 podman[245796]: 2026-01-22 00:49:29.218933524 +0000 UTC m=+0.073560356 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 21 19:49:31 np0005591285 nova_compute[182755]: 2026-01-22 00:49:31.258 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:31 np0005591285 nova_compute[182755]: 2026-01-22 00:49:31.383 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:31 np0005591285 nova_compute[182755]: 2026-01-22 00:49:31.585 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:49:36 np0005591285 nova_compute[182755]: 2026-01-22 00:49:36.276 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:36 np0005591285 nova_compute[182755]: 2026-01-22 00:49:36.383 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:38 np0005591285 podman[245836]: 2026-01-22 00:49:38.206782407 +0000 UTC m=+0.081588463 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.733 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.734 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.734 182759 INFO nova.compute.manager [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Unshelving
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.854 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.855 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.860 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'pci_requests' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.875 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'numa_topology' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.888 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 19:49:38 np0005591285 nova_compute[182755]: 2026-01-22 00:49:38.889 182759 INFO nova.compute.claims [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Claim successful on node compute-2.ctlplane.example.com
Jan 21 19:49:39 np0005591285 nova_compute[182755]: 2026-01-22 00:49:39.009 182759 DEBUG nova.compute.provider_tree [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:49:39 np0005591285 nova_compute[182755]: 2026-01-22 00:49:39.025 182759 DEBUG nova.scheduler.client.report [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:49:39 np0005591285 nova_compute[182755]: 2026-01-22 00:49:39.049 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:49:39 np0005591285 nova_compute[182755]: 2026-01-22 00:49:39.663 182759 INFO nova.network.neutron [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating port ee3eb2da-6644-4c49-952b-d4fd939223d9 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.047 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:40.047 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 19:49:40 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:40.049 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.469 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.469 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.470 182759 DEBUG nova.network.neutron [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.600 182759 DEBUG nova.compute.manager [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.601 182759 DEBUG nova.compute.manager [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing instance network info cache due to event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 19:49:40 np0005591285 nova_compute[182755]: 2026-01-22 00:49:40.601 182759 DEBUG oslo_concurrency.lockutils [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.278 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.387 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.718 182759 DEBUG nova.network.neutron [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.746 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.748 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.748 182759 INFO nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Creating image(s)
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.749 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.749 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.750 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.750 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.751 182759 DEBUG oslo_concurrency.lockutils [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.751 182759 DEBUG nova.network.neutron [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.796 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "dd2b4e9aff705bf3376f6f40ce326783f810526c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:49:41 np0005591285 nova_compute[182755]: 2026-01-22 00:49:41.797 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "dd2b4e9aff705bf3376f6f40ce326783f810526c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:49:43 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:43.051 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:49:43 np0005591285 podman[245861]: 2026-01-22 00:49:43.18531435 +0000 UTC m=+0.055926650 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 19:49:43 np0005591285 podman[245860]: 2026-01-22 00:49:43.195794794 +0000 UTC m=+0.074032430 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.264 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.323 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.325 182759 DEBUG nova.virt.images [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] 01af4101-f702-46bd-aa85-cba557b6a17e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.326 182759 DEBUG nova.privsep.utils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.327 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.part /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.608 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.part /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.converted" returned: 0 in 0.281s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.619 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.673 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c.converted --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.674 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "dd2b4e9aff705bf3376f6f40ce326783f810526c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.693 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.747 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.749 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "dd2b4e9aff705bf3376f6f40ce326783f810526c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.749 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "dd2b4e9aff705bf3376f6f40ce326783f810526c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.765 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.864 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.867 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c,backing_fmt=raw /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.905 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c,backing_fmt=raw /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.907 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "dd2b4e9aff705bf3376f6f40ce326783f810526c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.907 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.971 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.973 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'migration_context' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.989 182759 INFO nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Rebasing disk image.#033[00m
Jan 21 19:49:43 np0005591285 nova_compute[182755]: 2026-01-22 00:49:43.990 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:49:44 np0005591285 nova_compute[182755]: 2026-01-22 00:49:44.048 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:49:44 np0005591285 nova_compute[182755]: 2026-01-22 00:49:44.049 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:49:44 np0005591285 nova_compute[182755]: 2026-01-22 00:49:44.292 182759 DEBUG nova.network.neutron [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updated VIF entry in instance network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 21 19:49:44 np0005591285 nova_compute[182755]: 2026-01-22 00:49:44.293 182759 DEBUG nova.network.neutron [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 21 19:49:44 np0005591285 nova_compute[182755]: 2026-01-22 00:49:44.310 182759 DEBUG oslo_concurrency.lockutils [req-67754d49-46b8-47ab-95f5-40fda4b5e4b5 req-5e02a163-8df2-4d2c-bcb7-0a908699c19f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.097 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk" returned: 0 in 2.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.098 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.099 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Ensure instance console log exists: /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.099 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.099 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.100 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.102 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Start _get_guest_xml network_info=[{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='6be5390638397d411adfdbde6af873d8',container_format='bare',created_at=2026-01-22T00:49:25Z,direct_url=<?>,disk_format='qcow2',id=01af4101-f702-46bd-aa85-cba557b6a17e,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-229309001-shelved',owner='9d05c3cf062a4f6ebb5083b35d40286e',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-22T00:49:31Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'disk_bus': 'virtio', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.105 182759 WARNING nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.110 182759 DEBUG nova.virt.libvirt.host [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.110 182759 DEBUG nova.virt.libvirt.host [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.113 182759 DEBUG nova.virt.libvirt.host [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.113 182759 DEBUG nova.virt.libvirt.host [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.115 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.116 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='6be5390638397d411adfdbde6af873d8',container_format='bare',created_at=2026-01-22T00:49:25Z,direct_url=<?>,disk_format='qcow2',id=01af4101-f702-46bd-aa85-cba557b6a17e,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-229309001-shelved',owner='9d05c3cf062a4f6ebb5083b35d40286e',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-22T00:49:31Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.117 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.117 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.117 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.117 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.117 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.118 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.118 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.118 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.119 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.119 182759 DEBUG nova.virt.hardware [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.119 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.144 182759 DEBUG nova.virt.libvirt.vif [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:48:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-229309001',display_name='tempest-TestShelveInstance-server-229309001',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-229309001',id=188,image_ref='01af4101-f702-46bd-aa85-cba557b6a17e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1678551224',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:49:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9d05c3cf062a4f6ebb5083b35d40286e',ramdisk_id='',reservation_id='r-urkgbm2c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image
_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1694031060',owner_user_name='tempest-TestShelveInstance-1694031060-project-member',shelved_at='2026-01-22T00:49:31.989100',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='01af4101-f702-46bd-aa85-cba557b6a17e'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:49:38Z,user_data=None,user_id='f96259409b0747b6ac866ebe79dcf160',uuid=07d46432-944a-49b9-9862-65d4e541e750,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.144 182759 DEBUG nova.network.os_vif_util [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converting VIF {"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.145 182759 DEBUG nova.network.os_vif_util [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.146 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'pci_devices' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.170 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] End _get_guest_xml xml=<domain type="kvm">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <uuid>07d46432-944a-49b9-9862-65d4e541e750</uuid>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <name>instance-000000bc</name>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <memory>131072</memory>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <vcpu>1</vcpu>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <metadata>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:name>tempest-TestShelveInstance-server-229309001</nova:name>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:creationTime>2026-01-22 00:49:46</nova:creationTime>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:flavor name="m1.nano">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:memory>128</nova:memory>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:disk>1</nova:disk>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:swap>0</nova:swap>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:ephemeral>0</nova:ephemeral>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:vcpus>1</nova:vcpus>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      </nova:flavor>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:owner>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:user uuid="f96259409b0747b6ac866ebe79dcf160">tempest-TestShelveInstance-1694031060-project-member</nova:user>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:project uuid="9d05c3cf062a4f6ebb5083b35d40286e">tempest-TestShelveInstance-1694031060</nova:project>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      </nova:owner>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:root type="image" uuid="01af4101-f702-46bd-aa85-cba557b6a17e"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <nova:ports>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        <nova:port uuid="ee3eb2da-6644-4c49-952b-d4fd939223d9">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:        </nova:port>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      </nova:ports>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </nova:instance>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </metadata>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <sysinfo type="smbios">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <system>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <entry name="manufacturer">RDO</entry>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <entry name="product">OpenStack Compute</entry>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <entry name="serial">07d46432-944a-49b9-9862-65d4e541e750</entry>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <entry name="uuid">07d46432-944a-49b9-9862-65d4e541e750</entry>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <entry name="family">Virtual Machine</entry>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </system>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </sysinfo>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <os>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <boot dev="hd"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <smbios mode="sysinfo"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </os>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <features>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <acpi/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <apic/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <vmcoreinfo/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </features>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <clock offset="utc">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <timer name="pit" tickpolicy="delay"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <timer name="hpet" present="no"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </clock>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <cpu mode="custom" match="exact">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <model>Nehalem</model>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <topology sockets="1" cores="1" threads="1"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </cpu>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  <devices>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <disk type="file" device="disk">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <target dev="vda" bus="virtio"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <disk type="file" device="cdrom">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <driver name="qemu" type="raw" cache="none"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <source file="/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <target dev="sda" bus="sata"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </disk>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <interface type="ethernet">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <mac address="fa:16:3e:6d:46:7b"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <driver name="vhost" rx_queue_size="512"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <mtu size="1442"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <target dev="tapee3eb2da-66"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </interface>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <serial type="pty">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <log file="/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/console.log" append="off"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </serial>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <video>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <model type="virtio"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </video>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <input type="tablet" bus="usb"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <input type="keyboard" bus="usb"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <rng model="virtio">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <backend model="random">/dev/urandom</backend>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </rng>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="pci" model="pcie-root-port"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <controller type="usb" index="0"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    <memballoon model="virtio">
Jan 21 19:49:46 np0005591285 nova_compute[182755]:      <stats period="10"/>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:    </memballoon>
Jan 21 19:49:46 np0005591285 nova_compute[182755]:  </devices>
Jan 21 19:49:46 np0005591285 nova_compute[182755]: </domain>
Jan 21 19:49:46 np0005591285 nova_compute[182755]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.171 182759 DEBUG nova.compute.manager [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Preparing to wait for external event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.171 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.171 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.172 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.172 182759 DEBUG nova.virt.libvirt.vif [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:48:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-229309001',display_name='tempest-TestShelveInstance-server-229309001',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-229309001',id=188,image_ref='01af4101-f702-46bd-aa85-cba557b6a17e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1678551224',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:49:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='9d05c3cf062a4f6ebb5083b35d40286e',ramdisk_id='',reservation_id='r-urkgbm2c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1694031060',owner_user_name='tempest-TestShelveInstance-1694031060-project-member',shelved_at='2026-01-22T00:49:31.989100',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='01af4101-f702-46bd-aa85-cba557b6a17e'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:49:38Z,user_data=None,user_id='f96259409b0747b6ac866ebe79dcf160',uuid=07d46432-944a-49b9-9862-65d4e541e750,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.172 182759 DEBUG nova.network.os_vif_util [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converting VIF {"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.173 182759 DEBUG nova.network.os_vif_util [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.173 182759 DEBUG os_vif [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.174 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.175 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.175 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.182 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.183 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee3eb2da-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.184 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapee3eb2da-66, col_values=(('external_ids', {'iface-id': 'ee3eb2da-6644-4c49-952b-d4fd939223d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:46:7b', 'vm-uuid': '07d46432-944a-49b9-9862-65d4e541e750'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.186 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:46 np0005591285 NetworkManager[55017]: <info>  [1769042986.1871] manager: (tapee3eb2da-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.190 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.194 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.196 182759 INFO os_vif [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66')#033[00m
Jan 21 19:49:46 np0005591285 podman[245934]: 2026-01-22 00:49:46.228757762 +0000 UTC m=+0.098717825 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.252 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.252 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.252 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] No VIF found with MAC fa:16:3e:6d:46:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.253 182759 INFO nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Using config drive#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.273 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.309 182759 DEBUG nova.objects.instance [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'keypairs' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:49:46 np0005591285 nova_compute[182755]: 2026-01-22 00:49:46.388 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:47 np0005591285 nova_compute[182755]: 2026-01-22 00:49:47.827 182759 INFO nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Creating config drive at /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config#033[00m
Jan 21 19:49:47 np0005591285 nova_compute[182755]: 2026-01-22 00:49:47.837 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw08_tw88 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 19:49:47 np0005591285 nova_compute[182755]: 2026-01-22 00:49:47.971 182759 DEBUG oslo_concurrency.processutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw08_tw88" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 19:49:48 np0005591285 kernel: tapee3eb2da-66: entered promiscuous mode
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.0309] manager: (tapee3eb2da-66): new Tun device (/org/freedesktop/NetworkManager/Devices/351)
Jan 21 19:49:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:49:48Z|00707|binding|INFO|Claiming lport ee3eb2da-6644-4c49-952b-d4fd939223d9 for this chassis.
Jan 21 19:49:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:49:48Z|00708|binding|INFO|ee3eb2da-6644-4c49-952b-d4fd939223d9: Claiming fa:16:3e:6d:46:7b 10.100.0.4
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.070 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.075 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 systemd-machined[154022]: New machine qemu-80-instance-000000bc.
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.1004] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.1010] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.099 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.105 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:46:7b 10.100.0.4'], port_security=['fa:16:3e:6d:46:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '07d46432-944a-49b9-9862-65d4e541e750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d05c3cf062a4f6ebb5083b35d40286e', 'neutron:revision_number': '7', 'neutron:security_group_ids': '58135b34-ec05-462e-8563-87deac605474', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d85ad74f-8de0-427b-84fe-c5395634422f, chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ee3eb2da-6644-4c49-952b-d4fd939223d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.106 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ee3eb2da-6644-4c49-952b-d4fd939223d9 in datapath 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 bound to our chassis#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.107 104259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.122 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4b531d9d-cec8-4412-bacc-b7ec80d0f6a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.123 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7f04cd1e-f1 in ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.124 211690 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7f04cd1e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.125 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ded842-e131-4ec5-9df7-6cbc141285c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.126 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[d7882b84-9ab0-4a7a-98a5-d74d37c160bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.136 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[e5182d21-6bfe-480b-ac03-294b1c07a7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.161 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[565044c1-dcfc-4986-b099-fc2e7232591f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.196 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4a50f442-c46a-47e8-bb2b-18023ff77839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.212 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[44e855e3-ea1d-4931-88a0-f51dcb59310d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.2135] manager: (tap7f04cd1e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/354)
Jan 21 19:49:48 np0005591285 systemd[1]: Started Virtual Machine qemu-80-instance-000000bc.
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.215 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 systemd-udevd[245986]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:49:48 np0005591285 systemd-udevd[245987]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.228 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:49:48Z|00709|binding|INFO|Setting lport ee3eb2da-6644-4c49-952b-d4fd939223d9 ovn-installed in OVS
Jan 21 19:49:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:49:48Z|00710|binding|INFO|Setting lport ee3eb2da-6644-4c49-952b-d4fd939223d9 up in Southbound
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.2407] device (tapee3eb2da-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.2416] device (tapee3eb2da-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.258 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0a291b-8076-49b9-8667-95b4e3567152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.261 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[c9830043-8fa4-48a3-8f2b-8ac6c1ad00f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.2941] device (tap7f04cd1e-f0): carrier: link connected
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.299 211704 DEBUG oslo.privsep.daemon [-] privsep: reply[83050212-66c7-45f2-9fd1-ba289af3e99f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.322 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[ce13b614-8a08-46c5-a42a-805d91012764]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f04cd1e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:ee:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754695, 'reachable_time': 24138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246012, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.336 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[1e8e9b67-0cf8-4458-abbe-964020208c95]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:eec2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 754695, 'tstamp': 754695}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246014, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.356 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[67c5a58e-9f5a-4e2d-9f41-bcc59b25b396]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f04cd1e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:ee:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754695, 'reachable_time': 24138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246015, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.388 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[986eecfd-f0db-4d88-ad36-9214f70f06e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.444 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5c81a0-1a7d-4dd2-a677-6e4b2a80cef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.446 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f04cd1e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.446 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.447 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f04cd1e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.448 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 NetworkManager[55017]: <info>  [1769042988.4493] manager: (tap7f04cd1e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/355)
Jan 21 19:49:48 np0005591285 kernel: tap7f04cd1e-f0: entered promiscuous mode
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.456 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.457 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f04cd1e-f0, col_values=(('external_ids', {'iface-id': 'f2779172-88a7-44be-ac20-583c93a461c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.458 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 ovn_controller[94908]: 2026-01-22T00:49:48Z|00711|binding|INFO|Releasing lport f2779172-88a7-44be-ac20-583c93a461c0 from this chassis (sb_readonly=0)
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.470 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.477 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.479 104259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.480 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[14ec34ae-d8bb-4698-b0e6-ded25ceaa465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.481 104259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: global
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    log         /dev/log local0 debug
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    log-tag     haproxy-metadata-proxy-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    user        root
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    group       root
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    maxconn     1024
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    pidfile     /var/lib/neutron/external/pids/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.pid.haproxy
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    daemon
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: defaults
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    log global
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    mode http
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    option httplog
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    option dontlognull
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    option http-server-close
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    option forwardfor
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    retries                 3
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    timeout http-request    30s
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    timeout connect         30s
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    timeout client          32s
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    timeout server          32s
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    timeout http-keep-alive 30s
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: listen listener
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    bind 169.254.169.254:80
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    server metadata /var/lib/neutron/metadata_proxy
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]:    http-request add-header X-OVN-Network-ID 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 21 19:49:48 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:49:48.482 104259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'env', 'PROCESS_TAG=haproxy-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.486 182759 DEBUG nova.compute.manager [req-8d375c76-9c34-4251-a4f0-38a8c487bbc9 req-673681df-4238-484f-9bcb-6e2e097293e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.487 182759 DEBUG oslo_concurrency.lockutils [req-8d375c76-9c34-4251-a4f0-38a8c487bbc9 req-673681df-4238-484f-9bcb-6e2e097293e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.487 182759 DEBUG oslo_concurrency.lockutils [req-8d375c76-9c34-4251-a4f0-38a8c487bbc9 req-673681df-4238-484f-9bcb-6e2e097293e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.487 182759 DEBUG oslo_concurrency.lockutils [req-8d375c76-9c34-4251-a4f0-38a8c487bbc9 req-673681df-4238-484f-9bcb-6e2e097293e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:48 np0005591285 nova_compute[182755]: 2026-01-22 00:49:48.488 182759 DEBUG nova.compute.manager [req-8d375c76-9c34-4251-a4f0-38a8c487bbc9 req-673681df-4238-484f-9bcb-6e2e097293e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Processing event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 21 19:49:48 np0005591285 podman[246047]: 2026-01-22 00:49:48.866000782 +0000 UTC m=+0.046086985 container create f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:49:48 np0005591285 systemd[1]: Started libpod-conmon-f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5.scope.
Jan 21 19:49:48 np0005591285 systemd[1]: Started libcrun container.
Jan 21 19:49:48 np0005591285 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38c7d67560dd6efab45a52153c99be85713e60a0bc7348c80713c650d8428380/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 19:49:48 np0005591285 podman[246047]: 2026-01-22 00:49:48.840527335 +0000 UTC m=+0.020613558 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 19:49:48 np0005591285 podman[246047]: 2026-01-22 00:49:48.940581494 +0000 UTC m=+0.120667697 container init f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:49:48 np0005591285 podman[246047]: 2026-01-22 00:49:48.946085133 +0000 UTC m=+0.126171356 container start f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:49:48 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [NOTICE]   (246066) : New worker (246068) forked
Jan 21 19:49:48 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [NOTICE]   (246066) : Loading success.
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.128 182759 DEBUG nova.compute.manager [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.130 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042989.1283724, 07d46432-944a-49b9-9862-65d4e541e750 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.131 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Started (Lifecycle Event)#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.135 182759 DEBUG nova.virt.libvirt.driver [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.138 182759 INFO nova.virt.libvirt.driver [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance spawned successfully.#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.154 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.156 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.177 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.177 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042989.129466, 07d46432-944a-49b9-9862-65d4e541e750 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.178 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Paused (Lifecycle Event)#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.197 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.200 182759 DEBUG nova.virt.driver [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] Emitting event <LifecycleEvent: 1769042989.1341958, 07d46432-944a-49b9-9862-65d4e541e750 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.201 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Resumed (Lifecycle Event)#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.217 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.221 182759 DEBUG nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.239 182759 INFO nova.compute.manager [None req-f447ef32-7edb-4e2c-a5c5-5452f2f9c37e - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.882 182759 DEBUG nova.compute.manager [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:49:49 np0005591285 nova_compute[182755]: 2026-01-22 00:49:49.984 182759 DEBUG oslo_concurrency.lockutils [None req-028b4ddc-8a2b-46bf-93f0-2ea61b821e0a f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 11.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:50 np0005591285 nova_compute[182755]: 2026-01-22 00:49:50.899 182759 DEBUG nova.compute.manager [req-29ed1fe6-0a00-435e-843a-3b1aade698e4 req-8bbce358-fb11-48b9-a42f-67b3b545dff1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:49:50 np0005591285 nova_compute[182755]: 2026-01-22 00:49:50.900 182759 DEBUG oslo_concurrency.lockutils [req-29ed1fe6-0a00-435e-843a-3b1aade698e4 req-8bbce358-fb11-48b9-a42f-67b3b545dff1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:49:50 np0005591285 nova_compute[182755]: 2026-01-22 00:49:50.901 182759 DEBUG oslo_concurrency.lockutils [req-29ed1fe6-0a00-435e-843a-3b1aade698e4 req-8bbce358-fb11-48b9-a42f-67b3b545dff1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:49:50 np0005591285 nova_compute[182755]: 2026-01-22 00:49:50.901 182759 DEBUG oslo_concurrency.lockutils [req-29ed1fe6-0a00-435e-843a-3b1aade698e4 req-8bbce358-fb11-48b9-a42f-67b3b545dff1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:49:50 np0005591285 nova_compute[182755]: 2026-01-22 00:49:50.902 182759 DEBUG nova.compute.manager [req-29ed1fe6-0a00-435e-843a-3b1aade698e4 req-8bbce358-fb11-48b9-a42f-67b3b545dff1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] No waiting events found dispatching network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 21 19:49:50 np0005591285 nova_compute[182755]: 2026-01-22 00:49:50.902 182759 WARNING nova.compute.manager [req-29ed1fe6-0a00-435e-843a-3b1aade698e4 req-8bbce358-fb11-48b9-a42f-67b3b545dff1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received unexpected event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 for instance with vm_state active and task_state None.#033[00m
Jan 21 19:49:51 np0005591285 nova_compute[182755]: 2026-01-22 00:49:51.188 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:51 np0005591285 nova_compute[182755]: 2026-01-22 00:49:51.393 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:56 np0005591285 nova_compute[182755]: 2026-01-22 00:49:56.220 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:49:56 np0005591285 nova_compute[182755]: 2026-01-22 00:49:56.393 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:00 np0005591285 podman[246086]: 2026-01-22 00:50:00.205479266 +0000 UTC m=+0.066739892 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 19:50:00 np0005591285 podman[246087]: 2026-01-22 00:50:00.229307409 +0000 UTC m=+0.092206200 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 21 19:50:01 np0005591285 nova_compute[182755]: 2026-01-22 00:50:01.223 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:01 np0005591285 ovn_controller[94908]: 2026-01-22T00:50:01Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:46:7b 10.100.0.4
Jan 21 19:50:01 np0005591285 nova_compute[182755]: 2026-01-22 00:50:01.395 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:03.020 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:50:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:03.022 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:50:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:03.023 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:50:06 np0005591285 nova_compute[182755]: 2026-01-22 00:50:06.228 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:06 np0005591285 nova_compute[182755]: 2026-01-22 00:50:06.396 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.119 182759 DEBUG nova.compute.manager [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.119 182759 DEBUG nova.compute.manager [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing instance network info cache due to event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.120 182759 DEBUG oslo_concurrency.lockutils [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.120 182759 DEBUG oslo_concurrency.lockutils [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.120 182759 DEBUG nova.network.neutron [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.189 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.190 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.190 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.190 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.190 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.204 182759 INFO nova.compute.manager [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Terminating instance#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.213 182759 DEBUG nova.compute.manager [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 21 19:50:08 np0005591285 kernel: tapee3eb2da-66 (unregistering): left promiscuous mode
Jan 21 19:50:08 np0005591285 NetworkManager[55017]: <info>  [1769043008.2371] device (tapee3eb2da-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:08 np0005591285 ovn_controller[94908]: 2026-01-22T00:50:08Z|00712|binding|INFO|Releasing lport ee3eb2da-6644-4c49-952b-d4fd939223d9 from this chassis (sb_readonly=0)
Jan 21 19:50:08 np0005591285 ovn_controller[94908]: 2026-01-22T00:50:08Z|00713|binding|INFO|Setting lport ee3eb2da-6644-4c49-952b-d4fd939223d9 down in Southbound
Jan 21 19:50:08 np0005591285 ovn_controller[94908]: 2026-01-22T00:50:08Z|00714|binding|INFO|Removing iface tapee3eb2da-66 ovn-installed in OVS
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.265 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:46:7b 10.100.0.4'], port_security=['fa:16:3e:6d:46:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '07d46432-944a-49b9-9862-65d4e541e750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d05c3cf062a4f6ebb5083b35d40286e', 'neutron:revision_number': '9', 'neutron:security_group_ids': '58135b34-ec05-462e-8563-87deac605474', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d85ad74f-8de0-427b-84fe-c5395634422f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>], logical_port=ee3eb2da-6644-4c49-952b-d4fd939223d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fdd55b5c970>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.266 104259 INFO neutron.agent.ovn.metadata.agent [-] Port ee3eb2da-6644-4c49-952b-d4fd939223d9 in datapath 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 unbound from our chassis#033[00m
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.267 104259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.269 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f06d6d-1bc5-4dcb-bd82-b4764e53f29b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.270 104259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 namespace which is not needed anymore#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.271 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:08 np0005591285 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 21 19:50:08 np0005591285 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000bc.scope: Consumed 14.928s CPU time.
Jan 21 19:50:08 np0005591285 systemd-machined[154022]: Machine qemu-80-instance-000000bc terminated.
Jan 21 19:50:08 np0005591285 podman[246134]: 2026-01-22 00:50:08.337020909 +0000 UTC m=+0.074415769 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:50:08 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [NOTICE]   (246066) : haproxy version is 2.8.14-c23fe91
Jan 21 19:50:08 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [NOTICE]   (246066) : path to executable is /usr/sbin/haproxy
Jan 21 19:50:08 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [WARNING]  (246066) : Exiting Master process...
Jan 21 19:50:08 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [ALERT]    (246066) : Current worker (246068) exited with code 143 (Terminated)
Jan 21 19:50:08 np0005591285 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[246062]: [WARNING]  (246066) : All workers exited. Exiting... (0)
Jan 21 19:50:08 np0005591285 systemd[1]: libpod-f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5.scope: Deactivated successfully.
Jan 21 19:50:08 np0005591285 podman[246181]: 2026-01-22 00:50:08.419851355 +0000 UTC m=+0.048257104 container died f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:50:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay-38c7d67560dd6efab45a52153c99be85713e60a0bc7348c80713c650d8428380-merged.mount: Deactivated successfully.
Jan 21 19:50:08 np0005591285 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5-userdata-shm.mount: Deactivated successfully.
Jan 21 19:50:08 np0005591285 podman[246181]: 2026-01-22 00:50:08.455196169 +0000 UTC m=+0.083601918 container cleanup f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:50:08 np0005591285 systemd[1]: libpod-conmon-f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5.scope: Deactivated successfully.
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.470 182759 INFO nova.virt.libvirt.driver [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance destroyed successfully.#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.471 182759 DEBUG nova.objects.instance [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'resources' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.492 182759 DEBUG nova.virt.libvirt.vif [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:48:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-229309001',display_name='tempest-TestShelveInstance-server-229309001',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-229309001',id=188,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOgA5ZzfKbfeycJFDDkwZB6fgaNy8JupDR87kTs8Udzuy0Hdmm9UofiFEnY5+8X6yn18pBjwI+0V0Npbtx57RV5bhVB+OuvvOnztrIeeQxpfJd0y9DR5TAlaf0wFpgpxfw==',key_name='tempest-TestShelveInstance-1678551224',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:49:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d05c3cf062a4f6ebb5083b35d40286e',ramdisk_id='',reservation_id='r-urkgbm2c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1694031060',owner_user_name='tempest-TestShelveInstance-1694031060-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:49:49Z,user_data=None,user_id='f96259409b0747b6ac866ebe79dcf160',uuid=07d46432-944a-49b9-9862-65d4e541e750,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.493 182759 DEBUG nova.network.os_vif_util [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converting VIF {"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.494 182759 DEBUG nova.network.os_vif_util [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.494 182759 DEBUG os_vif [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.496 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.496 182759 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee3eb2da-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.498 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.498 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.501 182759 INFO os_vif [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66')
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.502 182759 INFO nova.virt.libvirt.driver [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Deleting instance files /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750_del
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.507 182759 INFO nova.virt.libvirt.driver [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Deletion of /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750_del complete
Jan 21 19:50:08 np0005591285 podman[246226]: 2026-01-22 00:50:08.515718191 +0000 UTC m=+0.037022850 container remove f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.521 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[756f7dfb-0067-4609-9c3f-85f16a419bdc]: (4, ('Thu Jan 22 12:50:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 (f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5)\nf02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5\nThu Jan 22 12:50:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 (f02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5)\nf02c1028e26949f3f9d3bd0fe7acff93513890affe16745e4ed3a4cfa7aa48a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.522 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e6852ac4-4a5a-42ea-a6fb-9d4775aa1343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.523 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f04cd1e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.524 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:08 np0005591285 kernel: tap7f04cd1e-f0: left promiscuous mode
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.526 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.528 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[c89861cc-cad5-4662-8ac1-74755f8b9718]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.536 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.550 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6fba49-5e41-4c58-b696-537edcd63c23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.551 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd590cf-b80b-46ed-aac6-4d4dc77a4824]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.563 211690 DEBUG oslo.privsep.daemon [-] privsep: reply[e83e6489-f2ff-44ac-8d26-7b35c119361b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 754685, 'reachable_time': 26951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246241, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.565 104650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 19:50:08 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:08.565 104650 DEBUG oslo.privsep.daemon [-] privsep: reply[e800f237-fb69-47f9-8546-b2ed1c2e7309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 19:50:08 np0005591285 systemd[1]: run-netns-ovnmeta\x2d7f04cd1e\x2dfc0c\x2d46c7\x2d9d75\x2d03b818ec99e2.mount: Deactivated successfully.
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.628 182759 INFO nova.compute.manager [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.629 182759 DEBUG oslo.service.loopingcall [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.629 182759 DEBUG nova.compute.manager [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 19:50:08 np0005591285 nova_compute[182755]: 2026-01-22 00:50:08.630 182759 DEBUG nova.network.neutron [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.300 182759 DEBUG nova.network.neutron [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updated VIF entry in instance network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.301 182759 DEBUG nova.network.neutron [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.311 182759 DEBUG nova.network.neutron [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.315 182759 DEBUG oslo_concurrency.lockutils [req-6e6aa7d1-4058-4f3a-a2c7-c519e24e415e req-eeb7b0f1-6164-41b2-8a37-7e6c4d9a7dff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.323 182759 INFO nova.compute.manager [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Took 0.69 seconds to deallocate network for instance.
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.390 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.390 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.450 182759 DEBUG nova.compute.provider_tree [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.470 182759 DEBUG nova.scheduler.client.report [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.496 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.532 182759 INFO nova.scheduler.client.report [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Deleted allocations for instance 07d46432-944a-49b9-9862-65d4e541e750
Jan 21 19:50:09 np0005591285 nova_compute[182755]: 2026-01-22 00:50:09.628 182759 DEBUG oslo_concurrency.lockutils [None req-593ca0d5-d976-4849-9266-29c1c0ffdfc6 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.247 182759 DEBUG nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-unplugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.247 182759 DEBUG oslo_concurrency.lockutils [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.247 182759 DEBUG oslo_concurrency.lockutils [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.247 182759 DEBUG oslo_concurrency.lockutils [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.248 182759 DEBUG nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] No waiting events found dispatching network-vif-unplugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.248 182759 WARNING nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received unexpected event network-vif-unplugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 for instance with vm_state deleted and task_state None.
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.248 182759 DEBUG nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.248 182759 DEBUG oslo_concurrency.lockutils [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.248 182759 DEBUG oslo_concurrency.lockutils [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.249 182759 DEBUG oslo_concurrency.lockutils [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.249 182759 DEBUG nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] No waiting events found dispatching network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.249 182759 WARNING nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received unexpected event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 for instance with vm_state deleted and task_state None.
Jan 21 19:50:10 np0005591285 nova_compute[182755]: 2026-01-22 00:50:10.249 182759 DEBUG nova.compute.manager [req-0bd41c1e-5a52-4c8c-8176-45a10d83c6e8 req-73539d94-d5d8-4203-8ae6-74561aa904aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-deleted-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 19:50:11 np0005591285 nova_compute[182755]: 2026-01-22 00:50:11.398 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:13 np0005591285 nova_compute[182755]: 2026-01-22 00:50:13.498 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:14 np0005591285 podman[246243]: 2026-01-22 00:50:14.191111541 +0000 UTC m=+0.057212314 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:50:14 np0005591285 podman[246242]: 2026-01-22 00:50:14.218315326 +0000 UTC m=+0.085188161 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:50:14 np0005591285 nova_compute[182755]: 2026-01-22 00:50:14.644 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:14 np0005591285 nova_compute[182755]: 2026-01-22 00:50:14.746 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:16 np0005591285 nova_compute[182755]: 2026-01-22 00:50:16.400 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:50:17 np0005591285 podman[246284]: 2026-01-22 00:50:17.252811696 +0000 UTC m=+0.118121009 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 21 19:50:18 np0005591285 nova_compute[182755]: 2026-01-22 00:50:18.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:18 np0005591285 nova_compute[182755]: 2026-01-22 00:50:18.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:50:18 np0005591285 nova_compute[182755]: 2026-01-22 00:50:18.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:50:18 np0005591285 nova_compute[182755]: 2026-01-22 00:50:18.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:50:18 np0005591285 nova_compute[182755]: 2026-01-22 00:50:18.239 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:18 np0005591285 nova_compute[182755]: 2026-01-22 00:50:18.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:19 np0005591285 nova_compute[182755]: 2026-01-22 00:50:19.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:19 np0005591285 nova_compute[182755]: 2026-01-22 00:50:19.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:50:20 np0005591285 nova_compute[182755]: 2026-01-22 00:50:20.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:21 np0005591285 nova_compute[182755]: 2026-01-22 00:50:21.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:21 np0005591285 nova_compute[182755]: 2026-01-22 00:50:21.243 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:21.243 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 19:50:21 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:21.245 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 19:50:21 np0005591285 nova_compute[182755]: 2026-01-22 00:50:21.402 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:22 np0005591285 nova_compute[182755]: 2026-01-22 00:50:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:50:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:50:23 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:50:23.247 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 19:50:23 np0005591285 nova_compute[182755]: 2026-01-22 00:50:23.468 182759 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769043008.4676807, 07d46432-944a-49b9-9862-65d4e541e750 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 21 19:50:23 np0005591285 nova_compute[182755]: 2026-01-22 00:50:23.468 182759 INFO nova.compute.manager [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Stopped (Lifecycle Event)#033[00m
Jan 21 19:50:23 np0005591285 nova_compute[182755]: 2026-01-22 00:50:23.496 182759 DEBUG nova.compute.manager [None req-8f2af15c-661a-4924-929c-681f9bcce0e6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 21 19:50:23 np0005591285 nova_compute[182755]: 2026-01-22 00:50:23.502 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:26 np0005591285 nova_compute[182755]: 2026-01-22 00:50:26.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.254 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.433 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.435 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5747MB free_disk=73.10916519165039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.435 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.435 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.505 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.583 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.584 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.652 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.751 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.752 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.786 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.841 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.867 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.890 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.913 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:50:28 np0005591285 nova_compute[182755]: 2026-01-22 00:50:28.914 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.478s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:50:29 np0005591285 nova_compute[182755]: 2026-01-22 00:50:29.914 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:31 np0005591285 podman[246311]: 2026-01-22 00:50:31.194649437 +0000 UTC m=+0.062572590 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 19:50:31 np0005591285 podman[246312]: 2026-01-22 00:50:31.209819986 +0000 UTC m=+0.072993080 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 19:50:31 np0005591285 nova_compute[182755]: 2026-01-22 00:50:31.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:33 np0005591285 nova_compute[182755]: 2026-01-22 00:50:33.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:33 np0005591285 nova_compute[182755]: 2026-01-22 00:50:33.508 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:36 np0005591285 nova_compute[182755]: 2026-01-22 00:50:36.408 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:38 np0005591285 nova_compute[182755]: 2026-01-22 00:50:38.511 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:39 np0005591285 podman[246351]: 2026-01-22 00:50:39.20542096 +0000 UTC m=+0.068107598 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:50:39 np0005591285 nova_compute[182755]: 2026-01-22 00:50:39.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:50:41 np0005591285 nova_compute[182755]: 2026-01-22 00:50:41.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:43 np0005591285 nova_compute[182755]: 2026-01-22 00:50:43.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:45 np0005591285 podman[246376]: 2026-01-22 00:50:45.186153031 +0000 UTC m=+0.058684605 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:50:45 np0005591285 podman[246375]: 2026-01-22 00:50:45.194050564 +0000 UTC m=+0.068994173 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:50:46 np0005591285 nova_compute[182755]: 2026-01-22 00:50:46.411 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:48 np0005591285 podman[246420]: 2026-01-22 00:50:48.232303546 +0000 UTC m=+0.107836001 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:50:48 np0005591285 nova_compute[182755]: 2026-01-22 00:50:48.517 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:50 np0005591285 ovn_controller[94908]: 2026-01-22T00:50:50Z|00715|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 21 19:50:51 np0005591285 nova_compute[182755]: 2026-01-22 00:50:51.415 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:53 np0005591285 nova_compute[182755]: 2026-01-22 00:50:53.521 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:56 np0005591285 nova_compute[182755]: 2026-01-22 00:50:56.416 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:50:58 np0005591285 nova_compute[182755]: 2026-01-22 00:50:58.525 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:01 np0005591285 nova_compute[182755]: 2026-01-22 00:51:01.418 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:02 np0005591285 podman[246447]: 2026-01-22 00:51:02.194733396 +0000 UTC m=+0.064317887 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 19:51:02 np0005591285 podman[246448]: 2026-01-22 00:51:02.204330044 +0000 UTC m=+0.071780587 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute)
Jan 21 19:51:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:51:03.021 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:51:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:51:03.021 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:51:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:51:03.021 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:51:03 np0005591285 nova_compute[182755]: 2026-01-22 00:51:03.529 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:06 np0005591285 nova_compute[182755]: 2026-01-22 00:51:06.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:08 np0005591285 nova_compute[182755]: 2026-01-22 00:51:08.532 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:10 np0005591285 podman[246488]: 2026-01-22 00:51:10.171904091 +0000 UTC m=+0.048605492 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:51:11 np0005591285 nova_compute[182755]: 2026-01-22 00:51:11.421 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:13 np0005591285 nova_compute[182755]: 2026-01-22 00:51:13.534 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:16 np0005591285 podman[246513]: 2026-01-22 00:51:16.193712991 +0000 UTC m=+0.057233046 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:51:16 np0005591285 podman[246512]: 2026-01-22 00:51:16.218645994 +0000 UTC m=+0.088628973 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 19:51:16 np0005591285 nova_compute[182755]: 2026-01-22 00:51:16.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:18 np0005591285 nova_compute[182755]: 2026-01-22 00:51:18.537 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:19 np0005591285 nova_compute[182755]: 2026-01-22 00:51:19.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:19 np0005591285 nova_compute[182755]: 2026-01-22 00:51:19.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:51:19 np0005591285 nova_compute[182755]: 2026-01-22 00:51:19.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:51:19 np0005591285 nova_compute[182755]: 2026-01-22 00:51:19.232 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:51:19 np0005591285 nova_compute[182755]: 2026-01-22 00:51:19.232 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:19 np0005591285 podman[246555]: 2026-01-22 00:51:19.24627365 +0000 UTC m=+0.120181414 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:51:20 np0005591285 nova_compute[182755]: 2026-01-22 00:51:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:20 np0005591285 nova_compute[182755]: 2026-01-22 00:51:20.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:51:21 np0005591285 nova_compute[182755]: 2026-01-22 00:51:21.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:21 np0005591285 nova_compute[182755]: 2026-01-22 00:51:21.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:22 np0005591285 nova_compute[182755]: 2026-01-22 00:51:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:22 np0005591285 nova_compute[182755]: 2026-01-22 00:51:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:23 np0005591285 nova_compute[182755]: 2026-01-22 00:51:23.540 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:26 np0005591285 nova_compute[182755]: 2026-01-22 00:51:26.426 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:28 np0005591285 systemd-logind[788]: New session 54 of user zuul.
Jan 21 19:51:28 np0005591285 systemd[1]: Started Session 54 of User zuul.
Jan 21 19:51:28 np0005591285 nova_compute[182755]: 2026-01-22 00:51:28.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.255 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.255 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.480 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.482 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5730MB free_disk=73.10916519165039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.482 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.483 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.599 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.601 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.649 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.671 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.673 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:51:29 np0005591285 nova_compute[182755]: 2026-01-22 00:51:29.674 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:51:30 np0005591285 nova_compute[182755]: 2026-01-22 00:51:30.675 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:31 np0005591285 nova_compute[182755]: 2026-01-22 00:51:31.429 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:33 np0005591285 podman[246750]: 2026-01-22 00:51:33.201136883 +0000 UTC m=+0.067451721 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 19:51:33 np0005591285 podman[246749]: 2026-01-22 00:51:33.201691168 +0000 UTC m=+0.067226495 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red 
Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 19:51:33 np0005591285 nova_compute[182755]: 2026-01-22 00:51:33.547 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:34 np0005591285 nova_compute[182755]: 2026-01-22 00:51:34.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:51:36 np0005591285 nova_compute[182755]: 2026-01-22 00:51:36.436 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:36 np0005591285 ovs-vsctl[246836]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 21 19:51:37 np0005591285 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 246610 (sos)
Jan 21 19:51:37 np0005591285 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 21 19:51:37 np0005591285 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 21 19:51:37 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 21 19:51:37 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 21 19:51:37 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 21 19:51:38 np0005591285 nova_compute[182755]: 2026-01-22 00:51:38.549 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:41 np0005591285 systemd[1]: Starting Hostname Service...
Jan 21 19:51:41 np0005591285 systemd[1]: Started Hostname Service.
Jan 21 19:51:41 np0005591285 podman[247352]: 2026-01-22 00:51:41.158356111 +0000 UTC m=+0.094425648 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:51:41 np0005591285 nova_compute[182755]: 2026-01-22 00:51:41.437 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:43 np0005591285 nova_compute[182755]: 2026-01-22 00:51:43.552 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:46 np0005591285 nova_compute[182755]: 2026-01-22 00:51:46.439 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:47 np0005591285 podman[248106]: 2026-01-22 00:51:47.186601193 +0000 UTC m=+0.058988832 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:51:47 np0005591285 podman[248108]: 2026-01-22 00:51:47.189768719 +0000 UTC m=+0.061482890 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:51:48 np0005591285 nova_compute[182755]: 2026-01-22 00:51:48.553 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:49 np0005591285 ovs-appctl[248647]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 21 19:51:49 np0005591285 ovs-appctl[248654]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 21 19:51:49 np0005591285 ovs-appctl[248658]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 21 19:51:49 np0005591285 podman[248818]: 2026-01-22 00:51:49.906795903 +0000 UTC m=+0.154837839 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:51:51 np0005591285 nova_compute[182755]: 2026-01-22 00:51:51.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:53 np0005591285 nova_compute[182755]: 2026-01-22 00:51:53.556 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:56 np0005591285 nova_compute[182755]: 2026-01-22 00:51:56.461 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:56 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 21 19:51:58 np0005591285 nova_compute[182755]: 2026-01-22 00:51:58.558 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:51:58 np0005591285 systemd[1]: Starting Time & Date Service...
Jan 21 19:51:58 np0005591285 systemd[1]: Started Time & Date Service.
Jan 21 19:52:01 np0005591285 nova_compute[182755]: 2026-01-22 00:52:01.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:52:03.022 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:52:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:52:03.023 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:52:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:52:03.024 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:52:03 np0005591285 nova_compute[182755]: 2026-01-22 00:52:03.559 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:03 np0005591285 podman[250134]: 2026-01-22 00:52:03.889422889 +0000 UTC m=+0.064180533 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 21 19:52:03 np0005591285 podman[250133]: 2026-01-22 00:52:03.890696403 +0000 UTC m=+0.065586731 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 19:52:06 np0005591285 nova_compute[182755]: 2026-01-22 00:52:06.465 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:08 np0005591285 nova_compute[182755]: 2026-01-22 00:52:08.561 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:11 np0005591285 nova_compute[182755]: 2026-01-22 00:52:11.474 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:11 np0005591285 podman[250175]: 2026-01-22 00:52:11.678731035 +0000 UTC m=+0.052909889 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:52:13 np0005591285 nova_compute[182755]: 2026-01-22 00:52:13.563 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:16 np0005591285 nova_compute[182755]: 2026-01-22 00:52:16.476 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:18 np0005591285 podman[250199]: 2026-01-22 00:52:18.146640262 +0000 UTC m=+0.059219369 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:52:18 np0005591285 podman[250200]: 2026-01-22 00:52:18.146812577 +0000 UTC m=+0.059104716 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:52:18 np0005591285 nova_compute[182755]: 2026-01-22 00:52:18.565 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:20 np0005591285 nova_compute[182755]: 2026-01-22 00:52:20.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:20 np0005591285 podman[250239]: 2026-01-22 00:52:20.249189473 +0000 UTC m=+0.111724866 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:52:21 np0005591285 nova_compute[182755]: 2026-01-22 00:52:21.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:21 np0005591285 nova_compute[182755]: 2026-01-22 00:52:21.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:52:21 np0005591285 nova_compute[182755]: 2026-01-22 00:52:21.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:52:21 np0005591285 nova_compute[182755]: 2026-01-22 00:52:21.240 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:52:21 np0005591285 nova_compute[182755]: 2026-01-22 00:52:21.476 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:22 np0005591285 nova_compute[182755]: 2026-01-22 00:52:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:22 np0005591285 nova_compute[182755]: 2026-01-22 00:52:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:22 np0005591285 nova_compute[182755]: 2026-01-22 00:52:22.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.184 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:52:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:52:23 np0005591285 nova_compute[182755]: 2026-01-22 00:52:23.566 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:23 np0005591285 systemd[1]: session-54.scope: Deactivated successfully.
Jan 21 19:52:23 np0005591285 systemd[1]: session-54.scope: Consumed 1min 26.307s CPU time, 704.7M memory peak, read 235.4M from disk, written 30.3M to disk.
Jan 21 19:52:23 np0005591285 systemd-logind[788]: Session 54 logged out. Waiting for processes to exit.
Jan 21 19:52:23 np0005591285 systemd-logind[788]: Removed session 54.
Jan 21 19:52:23 np0005591285 systemd-logind[788]: New session 55 of user zuul.
Jan 21 19:52:23 np0005591285 systemd[1]: Started Session 55 of User zuul.
Jan 21 19:52:24 np0005591285 systemd[1]: session-55.scope: Deactivated successfully.
Jan 21 19:52:24 np0005591285 systemd-logind[788]: Session 55 logged out. Waiting for processes to exit.
Jan 21 19:52:24 np0005591285 systemd-logind[788]: Removed session 55.
Jan 21 19:52:24 np0005591285 systemd-logind[788]: New session 56 of user zuul.
Jan 21 19:52:24 np0005591285 systemd[1]: Started Session 56 of User zuul.
Jan 21 19:52:24 np0005591285 nova_compute[182755]: 2026-01-22 00:52:24.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:24 np0005591285 nova_compute[182755]: 2026-01-22 00:52:24.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:24 np0005591285 systemd[1]: session-56.scope: Deactivated successfully.
Jan 21 19:52:24 np0005591285 systemd-logind[788]: Session 56 logged out. Waiting for processes to exit.
Jan 21 19:52:24 np0005591285 systemd-logind[788]: Removed session 56.
Jan 21 19:52:26 np0005591285 nova_compute[182755]: 2026-01-22 00:52:26.477 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:28 np0005591285 nova_compute[182755]: 2026-01-22 00:52:28.568 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:29 np0005591285 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 19:52:29 np0005591285 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.251 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.252 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.252 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.252 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.409 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.412 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5634MB free_disk=73.10889434814453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.412 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.412 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.520 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.521 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.547 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.567 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.569 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:52:30 np0005591285 nova_compute[182755]: 2026-01-22 00:52:30.569 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:52:31 np0005591285 nova_compute[182755]: 2026-01-22 00:52:31.479 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:33 np0005591285 nova_compute[182755]: 2026-01-22 00:52:33.570 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:34 np0005591285 podman[250329]: 2026-01-22 00:52:34.201288013 +0000 UTC m=+0.067105502 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 19:52:34 np0005591285 podman[250328]: 2026-01-22 00:52:34.232406853 +0000 UTC m=+0.096739221 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public)
Jan 21 19:52:36 np0005591285 nova_compute[182755]: 2026-01-22 00:52:36.482 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:36 np0005591285 nova_compute[182755]: 2026-01-22 00:52:36.569 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:38 np0005591285 nova_compute[182755]: 2026-01-22 00:52:38.607 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:41 np0005591285 nova_compute[182755]: 2026-01-22 00:52:41.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:52:41 np0005591285 nova_compute[182755]: 2026-01-22 00:52:41.484 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:42 np0005591285 podman[250367]: 2026-01-22 00:52:42.210461524 +0000 UTC m=+0.069545648 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:52:43 np0005591285 nova_compute[182755]: 2026-01-22 00:52:43.609 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:46 np0005591285 nova_compute[182755]: 2026-01-22 00:52:46.486 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:48 np0005591285 nova_compute[182755]: 2026-01-22 00:52:48.611 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:49 np0005591285 podman[250391]: 2026-01-22 00:52:49.22303944 +0000 UTC m=+0.087745208 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:52:49 np0005591285 podman[250392]: 2026-01-22 00:52:49.247600833 +0000 UTC m=+0.109688040 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:52:51 np0005591285 podman[250432]: 2026-01-22 00:52:51.21028339 +0000 UTC m=+0.083120154 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 19:52:51 np0005591285 nova_compute[182755]: 2026-01-22 00:52:51.489 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:53 np0005591285 nova_compute[182755]: 2026-01-22 00:52:53.641 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:56 np0005591285 nova_compute[182755]: 2026-01-22 00:52:56.491 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:52:58 np0005591285 nova_compute[182755]: 2026-01-22 00:52:58.644 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:01 np0005591285 nova_compute[182755]: 2026-01-22 00:53:01.493 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:53:03.023 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:53:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:53:03.025 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:53:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:53:03.025 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:53:03 np0005591285 nova_compute[182755]: 2026-01-22 00:53:03.682 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:05 np0005591285 podman[250458]: 2026-01-22 00:53:05.194737563 +0000 UTC m=+0.062536839 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 21 19:53:05 np0005591285 podman[250459]: 2026-01-22 00:53:05.20764558 +0000 UTC m=+0.063520154 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:53:06 np0005591285 nova_compute[182755]: 2026-01-22 00:53:06.495 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:08 np0005591285 nova_compute[182755]: 2026-01-22 00:53:08.685 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:11 np0005591285 nova_compute[182755]: 2026-01-22 00:53:11.497 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:13 np0005591285 podman[250500]: 2026-01-22 00:53:13.173754789 +0000 UTC m=+0.047759772 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:53:13 np0005591285 nova_compute[182755]: 2026-01-22 00:53:13.686 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:16 np0005591285 nova_compute[182755]: 2026-01-22 00:53:16.498 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:18 np0005591285 nova_compute[182755]: 2026-01-22 00:53:18.689 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:20 np0005591285 podman[250525]: 2026-01-22 00:53:20.178628538 +0000 UTC m=+0.048246806 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:53:20 np0005591285 podman[250526]: 2026-01-22 00:53:20.183458128 +0000 UTC m=+0.049420118 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:53:21 np0005591285 nova_compute[182755]: 2026-01-22 00:53:21.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:21 np0005591285 nova_compute[182755]: 2026-01-22 00:53:21.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:53:21 np0005591285 nova_compute[182755]: 2026-01-22 00:53:21.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:53:21 np0005591285 nova_compute[182755]: 2026-01-22 00:53:21.252 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:53:21 np0005591285 nova_compute[182755]: 2026-01-22 00:53:21.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:22 np0005591285 nova_compute[182755]: 2026-01-22 00:53:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:22 np0005591285 nova_compute[182755]: 2026-01-22 00:53:22.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:22 np0005591285 nova_compute[182755]: 2026-01-22 00:53:22.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:53:22 np0005591285 podman[250569]: 2026-01-22 00:53:22.240601374 +0000 UTC m=+0.110119148 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 21 19:53:23 np0005591285 nova_compute[182755]: 2026-01-22 00:53:23.736 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:24 np0005591285 nova_compute[182755]: 2026-01-22 00:53:24.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:24 np0005591285 nova_compute[182755]: 2026-01-22 00:53:24.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:25 np0005591285 nova_compute[182755]: 2026-01-22 00:53:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:26 np0005591285 nova_compute[182755]: 2026-01-22 00:53:26.501 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:28 np0005591285 nova_compute[182755]: 2026-01-22 00:53:28.738 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.275 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.276 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.276 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.277 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.441 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.443 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5687MB free_disk=73.10894775390625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.443 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.443 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.569 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.570 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.595 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.617 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.618 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:53:30 np0005591285 nova_compute[182755]: 2026-01-22 00:53:30.618 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:53:31 np0005591285 nova_compute[182755]: 2026-01-22 00:53:31.503 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:33 np0005591285 nova_compute[182755]: 2026-01-22 00:53:33.741 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:36 np0005591285 podman[250596]: 2026-01-22 00:53:36.186195161 +0000 UTC m=+0.056025595 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:53:36 np0005591285 podman[250595]: 2026-01-22 00:53:36.188081782 +0000 UTC m=+0.058736208 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 19:53:36 np0005591285 nova_compute[182755]: 2026-01-22 00:53:36.505 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:37 np0005591285 nova_compute[182755]: 2026-01-22 00:53:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:37 np0005591285 nova_compute[182755]: 2026-01-22 00:53:37.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:37 np0005591285 nova_compute[182755]: 2026-01-22 00:53:37.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:53:37 np0005591285 nova_compute[182755]: 2026-01-22 00:53:37.250 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:53:38 np0005591285 nova_compute[182755]: 2026-01-22 00:53:38.742 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:41 np0005591285 nova_compute[182755]: 2026-01-22 00:53:41.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:43 np0005591285 nova_compute[182755]: 2026-01-22 00:53:43.745 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:44 np0005591285 podman[250635]: 2026-01-22 00:53:44.219573892 +0000 UTC m=+0.086902034 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:53:46 np0005591285 nova_compute[182755]: 2026-01-22 00:53:46.508 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:48 np0005591285 nova_compute[182755]: 2026-01-22 00:53:48.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:51 np0005591285 systemd[1]: Starting dnf makecache...
Jan 21 19:53:51 np0005591285 podman[250660]: 2026-01-22 00:53:51.184004745 +0000 UTC m=+0.047662711 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:53:51 np0005591285 podman[250659]: 2026-01-22 00:53:51.211594336 +0000 UTC m=+0.075369475 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:53:51 np0005591285 dnf[250661]: Metadata cache refreshed recently.
Jan 21 19:53:51 np0005591285 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 21 19:53:51 np0005591285 systemd[1]: Finished dnf makecache.
Jan 21 19:53:51 np0005591285 nova_compute[182755]: 2026-01-22 00:53:51.509 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:53 np0005591285 podman[250699]: 2026-01-22 00:53:53.272206766 +0000 UTC m=+0.119229554 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:53:53 np0005591285 nova_compute[182755]: 2026-01-22 00:53:53.750 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:56 np0005591285 nova_compute[182755]: 2026-01-22 00:53:56.511 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:53:58 np0005591285 nova_compute[182755]: 2026-01-22 00:53:58.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:53:58 np0005591285 nova_compute[182755]: 2026-01-22 00:53:58.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:53:58 np0005591285 nova_compute[182755]: 2026-01-22 00:53:58.753 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:01 np0005591285 nova_compute[182755]: 2026-01-22 00:54:01.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:54:03.025 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:54:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:54:03.025 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:54:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:54:03.026 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:54:03 np0005591285 nova_compute[182755]: 2026-01-22 00:54:03.755 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:06 np0005591285 nova_compute[182755]: 2026-01-22 00:54:06.512 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:07 np0005591285 podman[250726]: 2026-01-22 00:54:07.18488438 +0000 UTC m=+0.060067423 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 19:54:07 np0005591285 podman[250727]: 2026-01-22 00:54:07.186612856 +0000 UTC m=+0.056329373 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 21 19:54:08 np0005591285 nova_compute[182755]: 2026-01-22 00:54:08.756 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:11 np0005591285 nova_compute[182755]: 2026-01-22 00:54:11.513 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:12 np0005591285 nova_compute[182755]: 2026-01-22 00:54:12.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:13 np0005591285 nova_compute[182755]: 2026-01-22 00:54:13.758 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:15 np0005591285 podman[250768]: 2026-01-22 00:54:15.1735262 +0000 UTC m=+0.051023452 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:54:16 np0005591285 nova_compute[182755]: 2026-01-22 00:54:16.516 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:18 np0005591285 nova_compute[182755]: 2026-01-22 00:54:18.761 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:21 np0005591285 nova_compute[182755]: 2026-01-22 00:54:21.244 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:21 np0005591285 nova_compute[182755]: 2026-01-22 00:54:21.244 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:54:21 np0005591285 nova_compute[182755]: 2026-01-22 00:54:21.245 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:54:21 np0005591285 nova_compute[182755]: 2026-01-22 00:54:21.269 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:54:21 np0005591285 nova_compute[182755]: 2026-01-22 00:54:21.518 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:22 np0005591285 podman[250792]: 2026-01-22 00:54:22.217408576 +0000 UTC m=+0.082424774 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:54:22 np0005591285 nova_compute[182755]: 2026-01-22 00:54:22.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:22 np0005591285 podman[250793]: 2026-01-22 00:54:22.236440027 +0000 UTC m=+0.094817737 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:54:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:54:23 np0005591285 nova_compute[182755]: 2026-01-22 00:54:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:23 np0005591285 nova_compute[182755]: 2026-01-22 00:54:23.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:54:23 np0005591285 nova_compute[182755]: 2026-01-22 00:54:23.763 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:24 np0005591285 podman[250833]: 2026-01-22 00:54:24.270920674 +0000 UTC m=+0.135443298 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 19:54:26 np0005591285 nova_compute[182755]: 2026-01-22 00:54:26.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:26 np0005591285 nova_compute[182755]: 2026-01-22 00:54:26.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:26 np0005591285 nova_compute[182755]: 2026-01-22 00:54:26.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:26 np0005591285 nova_compute[182755]: 2026-01-22 00:54:26.519 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:28 np0005591285 nova_compute[182755]: 2026-01-22 00:54:28.766 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:30 np0005591285 nova_compute[182755]: 2026-01-22 00:54:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.248 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.249 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.482 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.484 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.10884094238281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.484 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.484 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.521 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.601 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.602 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.647 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.677 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.678 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:54:31 np0005591285 nova_compute[182755]: 2026-01-22 00:54:31.679 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:54:33 np0005591285 nova_compute[182755]: 2026-01-22 00:54:33.767 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:36 np0005591285 nova_compute[182755]: 2026-01-22 00:54:36.524 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:37 np0005591285 nova_compute[182755]: 2026-01-22 00:54:37.680 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:38 np0005591285 podman[250861]: 2026-01-22 00:54:38.198249279 +0000 UTC m=+0.067424101 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 19:54:38 np0005591285 podman[250860]: 2026-01-22 00:54:38.209547223 +0000 UTC m=+0.072857518 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc.)
Jan 21 19:54:38 np0005591285 nova_compute[182755]: 2026-01-22 00:54:38.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:41 np0005591285 nova_compute[182755]: 2026-01-22 00:54:41.526 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:43 np0005591285 nova_compute[182755]: 2026-01-22 00:54:43.771 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:45 np0005591285 nova_compute[182755]: 2026-01-22 00:54:45.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:54:46 np0005591285 podman[250897]: 2026-01-22 00:54:46.217951832 +0000 UTC m=+0.077443650 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:54:46 np0005591285 nova_compute[182755]: 2026-01-22 00:54:46.528 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:48 np0005591285 nova_compute[182755]: 2026-01-22 00:54:48.772 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:51 np0005591285 nova_compute[182755]: 2026-01-22 00:54:51.531 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:53 np0005591285 podman[250922]: 2026-01-22 00:54:53.200093413 +0000 UTC m=+0.064412721 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:54:53 np0005591285 podman[250921]: 2026-01-22 00:54:53.204995405 +0000 UTC m=+0.072520829 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 21 19:54:53 np0005591285 nova_compute[182755]: 2026-01-22 00:54:53.774 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:55 np0005591285 podman[250962]: 2026-01-22 00:54:55.280563326 +0000 UTC m=+0.142433246 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 19:54:56 np0005591285 nova_compute[182755]: 2026-01-22 00:54:56.533 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:54:58 np0005591285 nova_compute[182755]: 2026-01-22 00:54:58.776 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:01 np0005591285 nova_compute[182755]: 2026-01-22 00:55:01.571 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:55:03.027 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:55:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:55:03.027 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:55:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:55:03.028 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:55:03 np0005591285 nova_compute[182755]: 2026-01-22 00:55:03.779 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:06 np0005591285 nova_compute[182755]: 2026-01-22 00:55:06.575 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:08 np0005591285 nova_compute[182755]: 2026-01-22 00:55:08.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:09 np0005591285 podman[250990]: 2026-01-22 00:55:09.201493552 +0000 UTC m=+0.067399581 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 21 19:55:09 np0005591285 podman[250989]: 2026-01-22 00:55:09.260182628 +0000 UTC m=+0.110478338 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 21 19:55:11 np0005591285 nova_compute[182755]: 2026-01-22 00:55:11.597 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:13 np0005591285 nova_compute[182755]: 2026-01-22 00:55:13.872 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:16 np0005591285 nova_compute[182755]: 2026-01-22 00:55:16.601 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:17 np0005591285 podman[251031]: 2026-01-22 00:55:17.200297313 +0000 UTC m=+0.071729227 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:55:18 np0005591285 nova_compute[182755]: 2026-01-22 00:55:18.874 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:21 np0005591285 nova_compute[182755]: 2026-01-22 00:55:21.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:21 np0005591285 nova_compute[182755]: 2026-01-22 00:55:21.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:55:21 np0005591285 nova_compute[182755]: 2026-01-22 00:55:21.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:55:21 np0005591285 nova_compute[182755]: 2026-01-22 00:55:21.235 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:55:21 np0005591285 nova_compute[182755]: 2026-01-22 00:55:21.603 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:23 np0005591285 nova_compute[182755]: 2026-01-22 00:55:23.914 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:24 np0005591285 podman[251057]: 2026-01-22 00:55:24.217847103 +0000 UTC m=+0.075093898 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:55:24 np0005591285 podman[251056]: 2026-01-22 00:55:24.217821673 +0000 UTC m=+0.074760759 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 19:55:24 np0005591285 nova_compute[182755]: 2026-01-22 00:55:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:24 np0005591285 nova_compute[182755]: 2026-01-22 00:55:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:24 np0005591285 nova_compute[182755]: 2026-01-22 00:55:24.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:55:26 np0005591285 nova_compute[182755]: 2026-01-22 00:55:26.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:26 np0005591285 podman[251095]: 2026-01-22 00:55:26.291205235 +0000 UTC m=+0.159734731 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 19:55:26 np0005591285 nova_compute[182755]: 2026-01-22 00:55:26.605 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:27 np0005591285 nova_compute[182755]: 2026-01-22 00:55:27.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:28 np0005591285 nova_compute[182755]: 2026-01-22 00:55:28.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:28 np0005591285 nova_compute[182755]: 2026-01-22 00:55:28.916 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:30 np0005591285 nova_compute[182755]: 2026-01-22 00:55:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:31 np0005591285 nova_compute[182755]: 2026-01-22 00:55:31.606 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.283 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.284 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.284 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.284 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.490 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.491 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5710MB free_disk=73.10892868041992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.492 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.492 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.685 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.686 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.875 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.989 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 19:55:32 np0005591285 nova_compute[182755]: 2026-01-22 00:55:32.989 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.006 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.035 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.059 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.076 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.078 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.078 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:55:33 np0005591285 nova_compute[182755]: 2026-01-22 00:55:33.918 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:36 np0005591285 nova_compute[182755]: 2026-01-22 00:55:36.608 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:38 np0005591285 nova_compute[182755]: 2026-01-22 00:55:38.078 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:55:38 np0005591285 nova_compute[182755]: 2026-01-22 00:55:38.921 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:40 np0005591285 podman[251122]: 2026-01-22 00:55:40.225657903 +0000 UTC m=+0.081394716 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 19:55:40 np0005591285 podman[251121]: 2026-01-22 00:55:40.230962226 +0000 UTC m=+0.095304851 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, architecture=x86_64)
Jan 21 19:55:41 np0005591285 nova_compute[182755]: 2026-01-22 00:55:41.610 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:43 np0005591285 nova_compute[182755]: 2026-01-22 00:55:43.961 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:46 np0005591285 nova_compute[182755]: 2026-01-22 00:55:46.624 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:48 np0005591285 podman[251162]: 2026-01-22 00:55:48.232755997 +0000 UTC m=+0.097242493 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:55:48 np0005591285 nova_compute[182755]: 2026-01-22 00:55:48.962 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:51 np0005591285 nova_compute[182755]: 2026-01-22 00:55:51.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:53 np0005591285 nova_compute[182755]: 2026-01-22 00:55:53.965 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:55 np0005591285 podman[251189]: 2026-01-22 00:55:55.186144193 +0000 UTC m=+0.060093545 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:55:55 np0005591285 podman[251188]: 2026-01-22 00:55:55.186324098 +0000 UTC m=+0.063352123 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:55:56 np0005591285 nova_compute[182755]: 2026-01-22 00:55:56.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:55:57 np0005591285 podman[251230]: 2026-01-22 00:55:57.266243269 +0000 UTC m=+0.136491305 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:55:58 np0005591285 nova_compute[182755]: 2026-01-22 00:55:58.967 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:56:01 np0005591285 nova_compute[182755]: 2026-01-22 00:56:01.668 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:56:03.029 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:56:03.030 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:56:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:56:03.030 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:56:04 np0005591285 nova_compute[182755]: 2026-01-22 00:56:04.022 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:56:06 np0005591285 nova_compute[182755]: 2026-01-22 00:56:06.669 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:56:09 np0005591285 nova_compute[182755]: 2026-01-22 00:56:09.083 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:56:11 np0005591285 podman[251256]: 2026-01-22 00:56:11.208586978 +0000 UTC m=+0.076752271 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 21 19:56:11 np0005591285 podman[251257]: 2026-01-22 00:56:11.229736637 +0000 UTC m=+0.086074153 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 21 19:56:11 np0005591285 nova_compute[182755]: 2026-01-22 00:56:11.726 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:14 np0005591285 nova_compute[182755]: 2026-01-22 00:56:14.137 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:16 np0005591285 nova_compute[182755]: 2026-01-22 00:56:16.730 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:19 np0005591285 nova_compute[182755]: 2026-01-22 00:56:19.183 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:19 np0005591285 podman[251298]: 2026-01-22 00:56:19.248149471 +0000 UTC m=+0.063908608 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:56:21 np0005591285 nova_compute[182755]: 2026-01-22 00:56:21.734 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:56:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:56:23 np0005591285 nova_compute[182755]: 2026-01-22 00:56:23.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:23 np0005591285 nova_compute[182755]: 2026-01-22 00:56:23.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 19:56:23 np0005591285 nova_compute[182755]: 2026-01-22 00:56:23.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 19:56:23 np0005591285 nova_compute[182755]: 2026-01-22 00:56:23.242 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 19:56:24 np0005591285 nova_compute[182755]: 2026-01-22 00:56:24.185 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:24 np0005591285 nova_compute[182755]: 2026-01-22 00:56:24.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:24 np0005591285 nova_compute[182755]: 2026-01-22 00:56:24.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 19:56:25 np0005591285 nova_compute[182755]: 2026-01-22 00:56:25.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:26 np0005591285 podman[251324]: 2026-01-22 00:56:26.219554031 +0000 UTC m=+0.077542183 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 21 19:56:26 np0005591285 podman[251325]: 2026-01-22 00:56:26.232262162 +0000 UTC m=+0.082616439 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:56:26 np0005591285 nova_compute[182755]: 2026-01-22 00:56:26.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:28 np0005591285 nova_compute[182755]: 2026-01-22 00:56:28.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:28 np0005591285 nova_compute[182755]: 2026-01-22 00:56:28.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:28 np0005591285 podman[251365]: 2026-01-22 00:56:28.259328824 +0000 UTC m=+0.119609473 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 19:56:29 np0005591285 nova_compute[182755]: 2026-01-22 00:56:29.186 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:29 np0005591285 nova_compute[182755]: 2026-01-22 00:56:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:31 np0005591285 nova_compute[182755]: 2026-01-22 00:56:31.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:31 np0005591285 nova_compute[182755]: 2026-01-22 00:56:31.768 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.189 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.268 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.269 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.269 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.269 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.479 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.482 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5713MB free_disk=73.10894393920898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.482 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.483 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.598 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.598 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.640 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.657 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.660 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 19:56:34 np0005591285 nova_compute[182755]: 2026-01-22 00:56:34.660 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:56:36 np0005591285 nova_compute[182755]: 2026-01-22 00:56:36.770 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:37 np0005591285 nova_compute[182755]: 2026-01-22 00:56:37.662 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:39 np0005591285 nova_compute[182755]: 2026-01-22 00:56:39.191 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:41 np0005591285 nova_compute[182755]: 2026-01-22 00:56:41.772 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:42 np0005591285 podman[251391]: 2026-01-22 00:56:42.200459889 +0000 UTC m=+0.073824303 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.buildah.version=1.33.7)
Jan 21 19:56:42 np0005591285 podman[251392]: 2026-01-22 00:56:42.206692617 +0000 UTC m=+0.067391992 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 19:56:44 np0005591285 nova_compute[182755]: 2026-01-22 00:56:44.193 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:46 np0005591285 nova_compute[182755]: 2026-01-22 00:56:46.775 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:48 np0005591285 nova_compute[182755]: 2026-01-22 00:56:48.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:56:49 np0005591285 nova_compute[182755]: 2026-01-22 00:56:49.224 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:50 np0005591285 podman[251432]: 2026-01-22 00:56:50.194520319 +0000 UTC m=+0.067924204 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 19:56:51 np0005591285 nova_compute[182755]: 2026-01-22 00:56:51.778 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:54 np0005591285 nova_compute[182755]: 2026-01-22 00:56:54.226 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:56 np0005591285 nova_compute[182755]: 2026-01-22 00:56:56.779 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:57 np0005591285 podman[251458]: 2026-01-22 00:56:57.189925393 +0000 UTC m=+0.057276929 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:56:57 np0005591285 podman[251457]: 2026-01-22 00:56:57.204889084 +0000 UTC m=+0.073986597 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 19:56:59 np0005591285 nova_compute[182755]: 2026-01-22 00:56:59.269 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:56:59 np0005591285 podman[251498]: 2026-01-22 00:56:59.321006678 +0000 UTC m=+0.176004977 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:57:01 np0005591285 nova_compute[182755]: 2026-01-22 00:57:01.780 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:57:03.030 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 19:57:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:57:03.031 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 19:57:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:57:03.031 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 19:57:04 np0005591285 nova_compute[182755]: 2026-01-22 00:57:04.302 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:06 np0005591285 nova_compute[182755]: 2026-01-22 00:57:06.782 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:09 np0005591285 nova_compute[182755]: 2026-01-22 00:57:09.304 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:11 np0005591285 nova_compute[182755]: 2026-01-22 00:57:11.816 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:13 np0005591285 podman[251525]: 2026-01-22 00:57:13.220021781 +0000 UTC m=+0.087005217 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Jan 21 19:57:13 np0005591285 podman[251526]: 2026-01-22 00:57:13.220200706 +0000 UTC m=+0.081846019 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Jan 21 19:57:14 np0005591285 nova_compute[182755]: 2026-01-22 00:57:14.307 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:16 np0005591285 nova_compute[182755]: 2026-01-22 00:57:16.817 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:19 np0005591285 nova_compute[182755]: 2026-01-22 00:57:19.343 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:21 np0005591285 podman[251564]: 2026-01-22 00:57:21.209276394 +0000 UTC m=+0.075497379 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 19:57:21 np0005591285 nova_compute[182755]: 2026-01-22 00:57:21.862 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:23 np0005591285 nova_compute[182755]: 2026-01-22 00:57:23.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:57:23 np0005591285 nova_compute[182755]: 2026-01-22 00:57:23.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 19:57:23 np0005591285 nova_compute[182755]: 2026-01-22 00:57:23.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 19:57:23 np0005591285 nova_compute[182755]: 2026-01-22 00:57:23.241 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 19:57:24 np0005591285 nova_compute[182755]: 2026-01-22 00:57:24.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:25 np0005591285 nova_compute[182755]: 2026-01-22 00:57:25.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:57:25 np0005591285 nova_compute[182755]: 2026-01-22 00:57:25.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:57:25 np0005591285 nova_compute[182755]: 2026-01-22 00:57:25.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 19:57:26 np0005591285 nova_compute[182755]: 2026-01-22 00:57:26.860 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:28 np0005591285 podman[251589]: 2026-01-22 00:57:28.210340419 +0000 UTC m=+0.076166806 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 21 19:57:28 np0005591285 podman[251590]: 2026-01-22 00:57:28.21261417 +0000 UTC m=+0.065770667 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 19:57:29 np0005591285 nova_compute[182755]: 2026-01-22 00:57:29.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:57:29 np0005591285 nova_compute[182755]: 2026-01-22 00:57:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:57:29 np0005591285 nova_compute[182755]: 2026-01-22 00:57:29.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 19:57:30 np0005591285 nova_compute[182755]: 2026-01-22 00:57:30.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 19:57:30 np0005591285 podman[251632]: 2026-01-22 00:57:30.226419476 +0000 UTC m=+0.100160141 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.218 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.219 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.219 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.220 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.220 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.220 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.242 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.256 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.257 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.257 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.257 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.258 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.258 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.258 182759 WARNING nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.258 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Removable base files: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.258 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.259 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.259 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/6779aa1454b0f9e323fac2693f45a73902da912b#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.259 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/21a6e6787783e19d6abd064b1f558cdd7dc1053f#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.259 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b33c9920bf26c6dae549fa60eaf22a65772f20df#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.259 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/dd2b4e9aff705bf3376f6f40ce326783f810526c#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.259 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.260 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.260 182759 DEBUG nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.260 182759 INFO nova.virt.libvirt.imagecache [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Jan 21 19:57:31 np0005591285 nova_compute[182755]: 2026-01-22 00:57:31.898 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:32 np0005591285 nova_compute[182755]: 2026-01-22 00:57:32.260 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:57:34 np0005591285 nova_compute[182755]: 2026-01-22 00:57:34.448 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.249 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.250 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.430 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.431 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5709MB free_disk=73.10894393920898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.431 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.432 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.506 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.506 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.530 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.553 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.554 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:57:35 np0005591285 nova_compute[182755]: 2026-01-22 00:57:35.555 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:57:36 np0005591285 nova_compute[182755]: 2026-01-22 00:57:36.902 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:37 np0005591285 nova_compute[182755]: 2026-01-22 00:57:37.554 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:57:39 np0005591285 nova_compute[182755]: 2026-01-22 00:57:39.452 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:41 np0005591285 nova_compute[182755]: 2026-01-22 00:57:41.903 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:44 np0005591285 podman[251659]: 2026-01-22 00:57:44.218530372 +0000 UTC m=+0.081099759 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 19:57:44 np0005591285 podman[251658]: 2026-01-22 00:57:44.254752845 +0000 UTC m=+0.117421094 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 21 19:57:44 np0005591285 nova_compute[182755]: 2026-01-22 00:57:44.454 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:46 np0005591285 nova_compute[182755]: 2026-01-22 00:57:46.905 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:49 np0005591285 nova_compute[182755]: 2026-01-22 00:57:49.457 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:52 np0005591285 nova_compute[182755]: 2026-01-22 00:57:52.161 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:52 np0005591285 podman[251697]: 2026-01-22 00:57:52.251321973 +0000 UTC m=+0.062811838 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:57:54 np0005591285 nova_compute[182755]: 2026-01-22 00:57:54.500 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:56 np0005591285 nova_compute[182755]: 2026-01-22 00:57:56.936 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:57:59 np0005591285 podman[251721]: 2026-01-22 00:57:59.200308921 +0000 UTC m=+0.072095098 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 19:57:59 np0005591285 podman[251722]: 2026-01-22 00:57:59.212304213 +0000 UTC m=+0.076530686 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:57:59 np0005591285 nova_compute[182755]: 2026-01-22 00:57:59.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:01 np0005591285 podman[251764]: 2026-01-22 00:58:01.221938086 +0000 UTC m=+0.096162353 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 21 19:58:01 np0005591285 nova_compute[182755]: 2026-01-22 00:58:01.937 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:58:03.032 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:58:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:58:03.032 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:58:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:58:03.032 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:58:04 np0005591285 nova_compute[182755]: 2026-01-22 00:58:04.547 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:06 np0005591285 nova_compute[182755]: 2026-01-22 00:58:06.966 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:09 np0005591285 nova_compute[182755]: 2026-01-22 00:58:09.550 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:11 np0005591285 nova_compute[182755]: 2026-01-22 00:58:11.968 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:14 np0005591285 nova_compute[182755]: 2026-01-22 00:58:14.553 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:15 np0005591285 podman[251792]: 2026-01-22 00:58:15.184590461 +0000 UTC m=+0.052003228 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 21 19:58:15 np0005591285 podman[251791]: 2026-01-22 00:58:15.1927814 +0000 UTC m=+0.062239772 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9)
Jan 21 19:58:17 np0005591285 nova_compute[182755]: 2026-01-22 00:58:17.003 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:19 np0005591285 nova_compute[182755]: 2026-01-22 00:58:19.555 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:22 np0005591285 nova_compute[182755]: 2026-01-22 00:58:22.005 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 00:58:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 19:58:23 np0005591285 podman[251830]: 2026-01-22 00:58:23.196706548 +0000 UTC m=+0.072008945 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:58:24 np0005591285 nova_compute[182755]: 2026-01-22 00:58:24.558 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:25 np0005591285 nova_compute[182755]: 2026-01-22 00:58:25.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:25 np0005591285 nova_compute[182755]: 2026-01-22 00:58:25.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:58:25 np0005591285 nova_compute[182755]: 2026-01-22 00:58:25.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:58:25 np0005591285 nova_compute[182755]: 2026-01-22 00:58:25.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:58:25 np0005591285 nova_compute[182755]: 2026-01-22 00:58:25.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:25 np0005591285 nova_compute[182755]: 2026-01-22 00:58:25.239 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:58:27 np0005591285 nova_compute[182755]: 2026-01-22 00:58:27.007 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:27 np0005591285 nova_compute[182755]: 2026-01-22 00:58:27.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:29 np0005591285 nova_compute[182755]: 2026-01-22 00:58:29.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:29 np0005591285 nova_compute[182755]: 2026-01-22 00:58:29.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:29 np0005591285 nova_compute[182755]: 2026-01-22 00:58:29.575 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:30 np0005591285 podman[251857]: 2026-01-22 00:58:30.229175156 +0000 UTC m=+0.096610505 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 19:58:30 np0005591285 podman[251858]: 2026-01-22 00:58:30.229391302 +0000 UTC m=+0.094353915 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 19:58:31 np0005591285 nova_compute[182755]: 2026-01-22 00:58:31.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:32 np0005591285 nova_compute[182755]: 2026-01-22 00:58:32.008 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:32 np0005591285 nova_compute[182755]: 2026-01-22 00:58:32.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:32 np0005591285 podman[251898]: 2026-01-22 00:58:32.253614178 +0000 UTC m=+0.119465719 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 19:58:34 np0005591285 nova_compute[182755]: 2026-01-22 00:58:34.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.254 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.254 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.458 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.459 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.10894393920898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.459 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.459 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.517 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.518 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.539 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.576 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.578 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:58:35 np0005591285 nova_compute[182755]: 2026-01-22 00:58:35.578 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:58:37 np0005591285 nova_compute[182755]: 2026-01-22 00:58:37.010 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:37 np0005591285 nova_compute[182755]: 2026-01-22 00:58:37.578 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:39 np0005591285 nova_compute[182755]: 2026-01-22 00:58:39.615 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:42 np0005591285 nova_compute[182755]: 2026-01-22 00:58:42.014 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:44 np0005591285 nova_compute[182755]: 2026-01-22 00:58:44.618 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:46 np0005591285 podman[251926]: 2026-01-22 00:58:46.207221408 +0000 UTC m=+0.076044584 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:58:46 np0005591285 podman[251925]: 2026-01-22 00:58:46.213605559 +0000 UTC m=+0.075526739 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Jan 21 19:58:47 np0005591285 nova_compute[182755]: 2026-01-22 00:58:47.016 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:49 np0005591285 nova_compute[182755]: 2026-01-22 00:58:49.622 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:50 np0005591285 nova_compute[182755]: 2026-01-22 00:58:50.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:50 np0005591285 nova_compute[182755]: 2026-01-22 00:58:50.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 19:58:50 np0005591285 nova_compute[182755]: 2026-01-22 00:58:50.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 19:58:51 np0005591285 nova_compute[182755]: 2026-01-22 00:58:51.232 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:52 np0005591285 nova_compute[182755]: 2026-01-22 00:58:52.018 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:54 np0005591285 podman[251964]: 2026-01-22 00:58:54.217842302 +0000 UTC m=+0.075336464 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 19:58:54 np0005591285 nova_compute[182755]: 2026-01-22 00:58:54.625 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:57 np0005591285 nova_compute[182755]: 2026-01-22 00:58:57.020 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:58:59 np0005591285 nova_compute[182755]: 2026-01-22 00:58:59.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:58:59 np0005591285 nova_compute[182755]: 2026-01-22 00:58:59.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 19:58:59 np0005591285 nova_compute[182755]: 2026-01-22 00:58:59.628 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:01 np0005591285 podman[251989]: 2026-01-22 00:59:01.230983034 +0000 UTC m=+0.086763921 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 19:59:01 np0005591285 podman[251988]: 2026-01-22 00:59:01.258092821 +0000 UTC m=+0.113876558 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 19:59:02 np0005591285 nova_compute[182755]: 2026-01-22 00:59:02.023 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:59:03.033 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:59:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:59:03.033 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:59:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 00:59:03.033 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:59:03 np0005591285 podman[252032]: 2026-01-22 00:59:03.299133968 +0000 UTC m=+0.155728123 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 21 19:59:04 np0005591285 nova_compute[182755]: 2026-01-22 00:59:04.631 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:07 np0005591285 nova_compute[182755]: 2026-01-22 00:59:07.024 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:08 np0005591285 nova_compute[182755]: 2026-01-22 00:59:08.691 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:09 np0005591285 nova_compute[182755]: 2026-01-22 00:59:09.634 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:12 np0005591285 nova_compute[182755]: 2026-01-22 00:59:12.025 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:13 np0005591285 nova_compute[182755]: 2026-01-22 00:59:13.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:14 np0005591285 nova_compute[182755]: 2026-01-22 00:59:14.637 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:17 np0005591285 nova_compute[182755]: 2026-01-22 00:59:17.027 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:17 np0005591285 podman[252058]: 2026-01-22 00:59:17.207934765 +0000 UTC m=+0.074616645 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 19:59:17 np0005591285 podman[252059]: 2026-01-22 00:59:17.22859752 +0000 UTC m=+0.085650381 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 19:59:19 np0005591285 nova_compute[182755]: 2026-01-22 00:59:19.641 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:22 np0005591285 nova_compute[182755]: 2026-01-22 00:59:22.030 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:24 np0005591285 nova_compute[182755]: 2026-01-22 00:59:24.645 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:25 np0005591285 podman[252102]: 2026-01-22 00:59:25.219996559 +0000 UTC m=+0.082818485 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 19:59:25 np0005591285 nova_compute[182755]: 2026-01-22 00:59:25.239 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:25 np0005591285 nova_compute[182755]: 2026-01-22 00:59:25.240 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 19:59:27 np0005591285 nova_compute[182755]: 2026-01-22 00:59:27.030 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:27 np0005591285 nova_compute[182755]: 2026-01-22 00:59:27.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:27 np0005591285 nova_compute[182755]: 2026-01-22 00:59:27.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 19:59:27 np0005591285 nova_compute[182755]: 2026-01-22 00:59:27.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 19:59:27 np0005591285 nova_compute[182755]: 2026-01-22 00:59:27.243 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 19:59:28 np0005591285 nova_compute[182755]: 2026-01-22 00:59:28.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:29 np0005591285 nova_compute[182755]: 2026-01-22 00:59:29.212 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:29 np0005591285 nova_compute[182755]: 2026-01-22 00:59:29.648 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:30 np0005591285 nova_compute[182755]: 2026-01-22 00:59:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:31 np0005591285 nova_compute[182755]: 2026-01-22 00:59:31.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:32 np0005591285 nova_compute[182755]: 2026-01-22 00:59:32.033 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:32 np0005591285 podman[252127]: 2026-01-22 00:59:32.211237502 +0000 UTC m=+0.070247007 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 19:59:32 np0005591285 nova_compute[182755]: 2026-01-22 00:59:32.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:32 np0005591285 podman[252126]: 2026-01-22 00:59:32.220356847 +0000 UTC m=+0.092950696 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 19:59:34 np0005591285 podman[252167]: 2026-01-22 00:59:34.328388444 +0000 UTC m=+0.191789581 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 19:59:34 np0005591285 nova_compute[182755]: 2026-01-22 00:59:34.650 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.034 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.244 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.469 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.471 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5719MB free_disk=73.10840225219727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.471 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.472 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.564 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.565 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.592 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.609 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.611 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 19:59:37 np0005591285 nova_compute[182755]: 2026-01-22 00:59:37.612 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 19:59:39 np0005591285 nova_compute[182755]: 2026-01-22 00:59:39.613 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 19:59:39 np0005591285 nova_compute[182755]: 2026-01-22 00:59:39.653 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:42 np0005591285 nova_compute[182755]: 2026-01-22 00:59:42.037 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:44 np0005591285 nova_compute[182755]: 2026-01-22 00:59:44.656 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:47 np0005591285 nova_compute[182755]: 2026-01-22 00:59:47.039 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:48 np0005591285 podman[252194]: 2026-01-22 00:59:48.184210919 +0000 UTC m=+0.056907829 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 19:59:48 np0005591285 podman[252193]: 2026-01-22 00:59:48.211668246 +0000 UTC m=+0.085221239 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 21 19:59:49 np0005591285 nova_compute[182755]: 2026-01-22 00:59:49.659 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:52 np0005591285 nova_compute[182755]: 2026-01-22 00:59:52.041 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:54 np0005591285 nova_compute[182755]: 2026-01-22 00:59:54.662 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:56 np0005591285 podman[252232]: 2026-01-22 00:59:56.179570514 +0000 UTC m=+0.050292171 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 19:59:57 np0005591285 nova_compute[182755]: 2026-01-22 00:59:57.043 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 19:59:59 np0005591285 nova_compute[182755]: 2026-01-22 00:59:59.665 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:02 np0005591285 nova_compute[182755]: 2026-01-22 01:00:02.046 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:00:03.035 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:00:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:00:03.035 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:00:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:00:03.036 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:00:03 np0005591285 podman[252257]: 2026-01-22 01:00:03.209369392 +0000 UTC m=+0.068581263 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 20:00:03 np0005591285 podman[252256]: 2026-01-22 01:00:03.24728151 +0000 UTC m=+0.106682816 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 20:00:04 np0005591285 nova_compute[182755]: 2026-01-22 01:00:04.668 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:05 np0005591285 podman[252298]: 2026-01-22 01:00:05.245008384 +0000 UTC m=+0.109610894 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 20:00:07 np0005591285 nova_compute[182755]: 2026-01-22 01:00:07.049 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:09 np0005591285 nova_compute[182755]: 2026-01-22 01:00:09.672 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:12 np0005591285 nova_compute[182755]: 2026-01-22 01:00:12.053 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:14 np0005591285 nova_compute[182755]: 2026-01-22 01:00:14.676 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:17 np0005591285 nova_compute[182755]: 2026-01-22 01:00:17.055 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:19 np0005591285 podman[252325]: 2026-01-22 01:00:19.199247489 +0000 UTC m=+0.068009447 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 20:00:19 np0005591285 podman[252326]: 2026-01-22 01:00:19.212171986 +0000 UTC m=+0.077325127 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 20:00:19 np0005591285 nova_compute[182755]: 2026-01-22 01:00:19.677 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:22 np0005591285 nova_compute[182755]: 2026-01-22 01:00:22.092 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:00:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:00:24 np0005591285 nova_compute[182755]: 2026-01-22 01:00:24.680 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:27 np0005591285 nova_compute[182755]: 2026-01-22 01:00:27.094 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:27 np0005591285 podman[252364]: 2026-01-22 01:00:27.208301853 +0000 UTC m=+0.071022039 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 20:00:27 np0005591285 nova_compute[182755]: 2026-01-22 01:00:27.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:27 np0005591285 nova_compute[182755]: 2026-01-22 01:00:27.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:00:28 np0005591285 nova_compute[182755]: 2026-01-22 01:00:28.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:29 np0005591285 nova_compute[182755]: 2026-01-22 01:00:29.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:29 np0005591285 nova_compute[182755]: 2026-01-22 01:00:29.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 20:00:29 np0005591285 nova_compute[182755]: 2026-01-22 01:00:29.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 20:00:29 np0005591285 nova_compute[182755]: 2026-01-22 01:00:29.309 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 20:00:29 np0005591285 nova_compute[182755]: 2026-01-22 01:00:29.683 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:30 np0005591285 nova_compute[182755]: 2026-01-22 01:00:30.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:31 np0005591285 nova_compute[182755]: 2026-01-22 01:00:31.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:31 np0005591285 nova_compute[182755]: 2026-01-22 01:00:31.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:32 np0005591285 nova_compute[182755]: 2026-01-22 01:00:32.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:32 np0005591285 nova_compute[182755]: 2026-01-22 01:00:32.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:34 np0005591285 podman[252390]: 2026-01-22 01:00:34.207326004 +0000 UTC m=+0.069583580 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 20:00:34 np0005591285 podman[252391]: 2026-01-22 01:00:34.212988326 +0000 UTC m=+0.069901548 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 20:00:34 np0005591285 nova_compute[182755]: 2026-01-22 01:00:34.687 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:36 np0005591285 podman[252435]: 2026-01-22 01:00:36.242138934 +0000 UTC m=+0.111846715 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.100 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.266 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.267 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.267 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.267 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.461 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.463 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5713MB free_disk=73.1083755493164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.463 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:00:37 np0005591285 nova_compute[182755]: 2026-01-22 01:00:37.463 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:00:38 np0005591285 nova_compute[182755]: 2026-01-22 01:00:38.247 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:00:38 np0005591285 nova_compute[182755]: 2026-01-22 01:00:38.248 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:00:38 np0005591285 nova_compute[182755]: 2026-01-22 01:00:38.463 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 21 20:00:39 np0005591285 nova_compute[182755]: 2026-01-22 01:00:39.690 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:39 np0005591285 nova_compute[182755]: 2026-01-22 01:00:39.731 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 21 20:00:39 np0005591285 nova_compute[182755]: 2026-01-22 01:00:39.731 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 21 20:00:39 np0005591285 nova_compute[182755]: 2026-01-22 01:00:39.773 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 21 20:00:39 np0005591285 nova_compute[182755]: 2026-01-22 01:00:39.851 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 21 20:00:39 np0005591285 nova_compute[182755]: 2026-01-22 01:00:39.955 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:00:40 np0005591285 nova_compute[182755]: 2026-01-22 01:00:40.085 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:00:40 np0005591285 nova_compute[182755]: 2026-01-22 01:00:40.087 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:00:40 np0005591285 nova_compute[182755]: 2026-01-22 01:00:40.087 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:00:42 np0005591285 nova_compute[182755]: 2026-01-22 01:00:42.104 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:44 np0005591285 nova_compute[182755]: 2026-01-22 01:00:44.088 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:44 np0005591285 nova_compute[182755]: 2026-01-22 01:00:44.695 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:47 np0005591285 nova_compute[182755]: 2026-01-22 01:00:47.105 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:48 np0005591285 nova_compute[182755]: 2026-01-22 01:00:48.698 182759 DEBUG oslo_concurrency.processutils [None req-3383b229-0645-4d84-a161-f817197299df c798bde61dce4297a27213eac66acb7f 43b70c4e837343859ac97b6b2397ba1b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 21 20:00:48 np0005591285 nova_compute[182755]: 2026-01-22 01:00:48.749 182759 DEBUG oslo_concurrency.processutils [None req-3383b229-0645-4d84-a161-f817197299df c798bde61dce4297a27213eac66acb7f 43b70c4e837343859ac97b6b2397ba1b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 21 20:00:49 np0005591285 nova_compute[182755]: 2026-01-22 01:00:49.699 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:50 np0005591285 podman[252462]: 2026-01-22 01:00:50.213270285 +0000 UTC m=+0.074879042 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, release=1755695350, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6)
Jan 21 20:00:50 np0005591285 podman[252463]: 2026-01-22 01:00:50.260915234 +0000 UTC m=+0.107972440 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 20:00:52 np0005591285 nova_compute[182755]: 2026-01-22 01:00:52.108 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:52 np0005591285 nova_compute[182755]: 2026-01-22 01:00:52.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:00:54 np0005591285 nova_compute[182755]: 2026-01-22 01:00:54.748 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:57 np0005591285 nova_compute[182755]: 2026-01-22 01:00:57.110 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:58 np0005591285 podman[252505]: 2026-01-22 01:00:58.195776865 +0000 UTC m=+0.068987314 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 20:00:59 np0005591285 nova_compute[182755]: 2026-01-22 01:00:59.123 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:00:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:00:59.125 104259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 21 20:00:59 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:00:59.126 104259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 21 20:00:59 np0005591285 nova_compute[182755]: 2026-01-22 01:00:59.750 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:02 np0005591285 nova_compute[182755]: 2026-01-22 01:01:02.112 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:01:03.036 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:01:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:01:03.036 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:01:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:01:03.037 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:01:04 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:01:04.129 104259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ce4b296c-26ac-415a-aa87-9634754eb3d3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 21 20:01:04 np0005591285 nova_compute[182755]: 2026-01-22 01:01:04.753 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:05 np0005591285 podman[252546]: 2026-01-22 01:01:05.219744418 +0000 UTC m=+0.080055370 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 20:01:05 np0005591285 podman[252545]: 2026-01-22 01:01:05.228250536 +0000 UTC m=+0.087906511 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 20:01:07 np0005591285 nova_compute[182755]: 2026-01-22 01:01:07.114 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:07 np0005591285 podman[252586]: 2026-01-22 01:01:07.310759067 +0000 UTC m=+0.170710535 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 20:01:09 np0005591285 nova_compute[182755]: 2026-01-22 01:01:09.756 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:12 np0005591285 nova_compute[182755]: 2026-01-22 01:01:12.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:14 np0005591285 nova_compute[182755]: 2026-01-22 01:01:14.796 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:17 np0005591285 nova_compute[182755]: 2026-01-22 01:01:17.118 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:19 np0005591285 nova_compute[182755]: 2026-01-22 01:01:19.799 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:21 np0005591285 podman[252612]: 2026-01-22 01:01:21.221122856 +0000 UTC m=+0.087925201 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Jan 21 20:01:21 np0005591285 podman[252613]: 2026-01-22 01:01:21.233091079 +0000 UTC m=+0.090748039 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 20:01:22 np0005591285 nova_compute[182755]: 2026-01-22 01:01:22.121 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:24 np0005591285 nova_compute[182755]: 2026-01-22 01:01:24.802 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:27 np0005591285 nova_compute[182755]: 2026-01-22 01:01:27.125 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:28 np0005591285 nova_compute[182755]: 2026-01-22 01:01:28.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:28 np0005591285 nova_compute[182755]: 2026-01-22 01:01:28.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:01:29 np0005591285 podman[252652]: 2026-01-22 01:01:29.254223386 +0000 UTC m=+0.112222304 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 20:01:29 np0005591285 nova_compute[182755]: 2026-01-22 01:01:29.807 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:30 np0005591285 nova_compute[182755]: 2026-01-22 01:01:30.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:31 np0005591285 nova_compute[182755]: 2026-01-22 01:01:31.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:31 np0005591285 nova_compute[182755]: 2026-01-22 01:01:31.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:31 np0005591285 nova_compute[182755]: 2026-01-22 01:01:31.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 20:01:31 np0005591285 nova_compute[182755]: 2026-01-22 01:01:31.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 20:01:31 np0005591285 nova_compute[182755]: 2026-01-22 01:01:31.253 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 20:01:32 np0005591285 nova_compute[182755]: 2026-01-22 01:01:32.162 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:32 np0005591285 nova_compute[182755]: 2026-01-22 01:01:32.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:32 np0005591285 nova_compute[182755]: 2026-01-22 01:01:32.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:33 np0005591285 nova_compute[182755]: 2026-01-22 01:01:33.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:34 np0005591285 nova_compute[182755]: 2026-01-22 01:01:34.809 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:36 np0005591285 podman[252678]: 2026-01-22 01:01:36.22022988 +0000 UTC m=+0.081809747 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 20:01:36 np0005591285 podman[252679]: 2026-01-22 01:01:36.223193801 +0000 UTC m=+0.078196571 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.163 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.250 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.251 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.251 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.251 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.458 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.460 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5710MB free_disk=73.10839462280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.460 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.461 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.568 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.568 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.605 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.621 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.622 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:01:37 np0005591285 nova_compute[182755]: 2026-01-22 01:01:37.623 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:01:38 np0005591285 podman[252721]: 2026-01-22 01:01:38.246297886 +0000 UTC m=+0.110012126 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 20:01:39 np0005591285 nova_compute[182755]: 2026-01-22 01:01:39.811 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:41 np0005591285 nova_compute[182755]: 2026-01-22 01:01:41.623 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:01:42 np0005591285 nova_compute[182755]: 2026-01-22 01:01:42.205 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:44 np0005591285 nova_compute[182755]: 2026-01-22 01:01:44.856 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:47 np0005591285 nova_compute[182755]: 2026-01-22 01:01:47.207 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:49 np0005591285 nova_compute[182755]: 2026-01-22 01:01:49.859 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:52 np0005591285 podman[252747]: 2026-01-22 01:01:52.182363105 +0000 UTC m=+0.052875970 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 21 20:01:52 np0005591285 podman[252748]: 2026-01-22 01:01:52.215976728 +0000 UTC m=+0.081543961 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 21 20:01:52 np0005591285 nova_compute[182755]: 2026-01-22 01:01:52.228 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:54 np0005591285 nova_compute[182755]: 2026-01-22 01:01:54.862 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:57 np0005591285 nova_compute[182755]: 2026-01-22 01:01:57.231 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:01:59 np0005591285 nova_compute[182755]: 2026-01-22 01:01:59.865 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:00 np0005591285 podman[252788]: 2026-01-22 01:02:00.20001077 +0000 UTC m=+0.057192417 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 20:02:02 np0005591285 nova_compute[182755]: 2026-01-22 01:02:02.233 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:02:03.037 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:02:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:02:03.038 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:02:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:02:03.038 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:02:04 np0005591285 nova_compute[182755]: 2026-01-22 01:02:04.868 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:07 np0005591285 podman[252813]: 2026-01-22 01:02:07.197708976 +0000 UTC m=+0.067015950 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 21 20:02:07 np0005591285 podman[252814]: 2026-01-22 01:02:07.232752448 +0000 UTC m=+0.098483016 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 20:02:07 np0005591285 nova_compute[182755]: 2026-01-22 01:02:07.236 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:09 np0005591285 podman[252852]: 2026-01-22 01:02:09.264859944 +0000 UTC m=+0.142168816 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 21 20:02:09 np0005591285 nova_compute[182755]: 2026-01-22 01:02:09.870 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:12 np0005591285 nova_compute[182755]: 2026-01-22 01:02:12.238 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:14 np0005591285 nova_compute[182755]: 2026-01-22 01:02:14.872 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:17 np0005591285 nova_compute[182755]: 2026-01-22 01:02:17.240 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:19 np0005591285 nova_compute[182755]: 2026-01-22 01:02:19.875 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:22 np0005591285 nova_compute[182755]: 2026-01-22 01:02:22.243 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:02:23.196 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:02:23 np0005591285 podman[252878]: 2026-01-22 01:02:23.201987822 +0000 UTC m=+0.072308572 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 21 20:02:23 np0005591285 podman[252879]: 2026-01-22 01:02:23.22499855 +0000 UTC m=+0.078460537 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 20:02:24 np0005591285 nova_compute[182755]: 2026-01-22 01:02:24.929 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:27 np0005591285 nova_compute[182755]: 2026-01-22 01:02:27.245 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:28 np0005591285 nova_compute[182755]: 2026-01-22 01:02:28.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:28 np0005591285 nova_compute[182755]: 2026-01-22 01:02:28.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:02:29 np0005591285 nova_compute[182755]: 2026-01-22 01:02:29.933 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:31 np0005591285 podman[252919]: 2026-01-22 01:02:31.173086096 +0000 UTC m=+0.050540658 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 20:02:31 np0005591285 nova_compute[182755]: 2026-01-22 01:02:31.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:31 np0005591285 nova_compute[182755]: 2026-01-22 01:02:31.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:31 np0005591285 nova_compute[182755]: 2026-01-22 01:02:31.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 20:02:31 np0005591285 nova_compute[182755]: 2026-01-22 01:02:31.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 20:02:31 np0005591285 nova_compute[182755]: 2026-01-22 01:02:31.258 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 20:02:32 np0005591285 nova_compute[182755]: 2026-01-22 01:02:32.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:32 np0005591285 nova_compute[182755]: 2026-01-22 01:02:32.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:32 np0005591285 nova_compute[182755]: 2026-01-22 01:02:32.248 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:33 np0005591285 nova_compute[182755]: 2026-01-22 01:02:33.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:34 np0005591285 nova_compute[182755]: 2026-01-22 01:02:34.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:34 np0005591285 nova_compute[182755]: 2026-01-22 01:02:34.935 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:37 np0005591285 nova_compute[182755]: 2026-01-22 01:02:37.249 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:38 np0005591285 podman[252945]: 2026-01-22 01:02:38.199432871 +0000 UTC m=+0.063077825 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 20:02:38 np0005591285 podman[252944]: 2026-01-22 01:02:38.206763568 +0000 UTC m=+0.065911831 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.284 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.285 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.285 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.285 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.463 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.465 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.10839462280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.466 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.466 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.569 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.569 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.621 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.646 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.647 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:02:38 np0005591285 nova_compute[182755]: 2026-01-22 01:02:38.647 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:02:39 np0005591285 nova_compute[182755]: 2026-01-22 01:02:39.939 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:40 np0005591285 podman[252989]: 2026-01-22 01:02:40.220819431 +0000 UTC m=+0.098231620 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 20:02:42 np0005591285 nova_compute[182755]: 2026-01-22 01:02:42.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:42 np0005591285 nova_compute[182755]: 2026-01-22 01:02:42.646 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:44 np0005591285 nova_compute[182755]: 2026-01-22 01:02:44.940 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:47 np0005591285 nova_compute[182755]: 2026-01-22 01:02:47.252 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:49 np0005591285 nova_compute[182755]: 2026-01-22 01:02:49.942 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:52 np0005591285 nova_compute[182755]: 2026-01-22 01:02:52.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:02:52 np0005591285 nova_compute[182755]: 2026-01-22 01:02:52.255 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:54 np0005591285 podman[253020]: 2026-01-22 01:02:54.203623085 +0000 UTC m=+0.062176030 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 21 20:02:54 np0005591285 podman[253019]: 2026-01-22 01:02:54.218611567 +0000 UTC m=+0.074540402 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vendor=Red Hat, Inc.)
Jan 21 20:02:54 np0005591285 nova_compute[182755]: 2026-01-22 01:02:54.945 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:57 np0005591285 nova_compute[182755]: 2026-01-22 01:02:57.258 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:02:59 np0005591285 nova_compute[182755]: 2026-01-22 01:02:59.947 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:02 np0005591285 podman[253061]: 2026-01-22 01:03:02.190943065 +0000 UTC m=+0.060092684 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 20:03:02 np0005591285 nova_compute[182755]: 2026-01-22 01:03:02.260 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:03:03.038 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:03:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:03:03.038 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:03:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:03:03.039 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:03:04 np0005591285 nova_compute[182755]: 2026-01-22 01:03:04.950 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:07 np0005591285 nova_compute[182755]: 2026-01-22 01:03:07.262 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:09 np0005591285 podman[253085]: 2026-01-22 01:03:09.187480361 +0000 UTC m=+0.060451834 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 21 20:03:09 np0005591285 podman[253086]: 2026-01-22 01:03:09.205105344 +0000 UTC m=+0.079219157 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 20:03:09 np0005591285 nova_compute[182755]: 2026-01-22 01:03:09.952 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:11 np0005591285 podman[253127]: 2026-01-22 01:03:11.278756578 +0000 UTC m=+0.144224924 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 20:03:12 np0005591285 nova_compute[182755]: 2026-01-22 01:03:12.265 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:14 np0005591285 nova_compute[182755]: 2026-01-22 01:03:14.985 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:17 np0005591285 nova_compute[182755]: 2026-01-22 01:03:17.267 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:19 np0005591285 nova_compute[182755]: 2026-01-22 01:03:19.988 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:22 np0005591285 nova_compute[182755]: 2026-01-22 01:03:22.269 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:25 np0005591285 nova_compute[182755]: 2026-01-22 01:03:25.029 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:03:25 np0005591285 podman[253151]: 2026-01-22 01:03:25.208013214 +0000 UTC m=+0.078962611 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 20:03:25 np0005591285 podman[253152]: 2026-01-22 01:03:25.233856279 +0000 UTC m=+0.090648706 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 21 20:03:27 np0005591285 nova_compute[182755]: 2026-01-22 01:03:27.272 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:29 np0005591285 nova_compute[182755]: 2026-01-22 01:03:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:29 np0005591285 nova_compute[182755]: 2026-01-22 01:03:29.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 20:03:30 np0005591285 nova_compute[182755]: 2026-01-22 01:03:30.032 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:32 np0005591285 nova_compute[182755]: 2026-01-22 01:03:32.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:32 np0005591285 nova_compute[182755]: 2026-01-22 01:03:32.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 20:03:32 np0005591285 nova_compute[182755]: 2026-01-22 01:03:32.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 20:03:32 np0005591285 nova_compute[182755]: 2026-01-22 01:03:32.240 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 20:03:32 np0005591285 nova_compute[182755]: 2026-01-22 01:03:32.240 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:32 np0005591285 nova_compute[182755]: 2026-01-22 01:03:32.274 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:33 np0005591285 nova_compute[182755]: 2026-01-22 01:03:33.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:33 np0005591285 nova_compute[182755]: 2026-01-22 01:03:33.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:33 np0005591285 podman[253193]: 2026-01-22 01:03:33.232118322 +0000 UTC m=+0.090652056 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 20:03:34 np0005591285 nova_compute[182755]: 2026-01-22 01:03:34.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:34 np0005591285 nova_compute[182755]: 2026-01-22 01:03:34.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:35 np0005591285 nova_compute[182755]: 2026-01-22 01:03:35.034 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:37 np0005591285 nova_compute[182755]: 2026-01-22 01:03:37.278 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.366 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.367 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.367 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.368 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.631 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.633 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5712MB free_disk=73.10839462280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.633 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.634 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.710 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.711 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.753 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.769 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.772 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 20:03:39 np0005591285 nova_compute[182755]: 2026-01-22 01:03:39.772 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:03:40 np0005591285 nova_compute[182755]: 2026-01-22 01:03:40.037 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:40 np0005591285 podman[253219]: 2026-01-22 01:03:40.186557175 +0000 UTC m=+0.056001715 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 20:03:40 np0005591285 podman[253218]: 2026-01-22 01:03:40.187505301 +0000 UTC m=+0.062870420 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 20:03:42 np0005591285 podman[253262]: 2026-01-22 01:03:42.240811988 +0000 UTC m=+0.111037642 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 20:03:42 np0005591285 nova_compute[182755]: 2026-01-22 01:03:42.278 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:44 np0005591285 nova_compute[182755]: 2026-01-22 01:03:44.773 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:45 np0005591285 nova_compute[182755]: 2026-01-22 01:03:45.040 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:47 np0005591285 nova_compute[182755]: 2026-01-22 01:03:47.280 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:50 np0005591285 nova_compute[182755]: 2026-01-22 01:03:50.042 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:52 np0005591285 nova_compute[182755]: 2026-01-22 01:03:52.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:55 np0005591285 nova_compute[182755]: 2026-01-22 01:03:55.045 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:03:56 np0005591285 podman[253289]: 2026-01-22 01:03:56.211907467 +0000 UTC m=+0.076150806 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 20:03:56 np0005591285 podman[253288]: 2026-01-22 01:03:56.226442067 +0000 UTC m=+0.084712645 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Jan 21 20:03:57 np0005591285 nova_compute[182755]: 2026-01-22 01:03:57.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:03:57 np0005591285 nova_compute[182755]: 2026-01-22 01:03:57.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 20:03:57 np0005591285 nova_compute[182755]: 2026-01-22 01:03:57.236 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 20:03:57 np0005591285 nova_compute[182755]: 2026-01-22 01:03:57.282 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:00 np0005591285 nova_compute[182755]: 2026-01-22 01:04:00.048 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:00 np0005591285 nova_compute[182755]: 2026-01-22 01:04:00.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:00 np0005591285 nova_compute[182755]: 2026-01-22 01:04:00.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 20:04:02 np0005591285 nova_compute[182755]: 2026-01-22 01:04:02.285 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:04:03.039 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:04:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:04:03.040 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:04:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:04:03.040 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:04:04 np0005591285 podman[253330]: 2026-01-22 01:04:04.217298941 +0000 UTC m=+0.081726485 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 20:04:05 np0005591285 nova_compute[182755]: 2026-01-22 01:04:05.051 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:07 np0005591285 nova_compute[182755]: 2026-01-22 01:04:07.315 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:10 np0005591285 nova_compute[182755]: 2026-01-22 01:04:10.053 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:11 np0005591285 podman[253357]: 2026-01-22 01:04:11.197930479 +0000 UTC m=+0.055615354 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 20:04:11 np0005591285 podman[253356]: 2026-01-22 01:04:11.202602395 +0000 UTC m=+0.062460298 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 20:04:12 np0005591285 nova_compute[182755]: 2026-01-22 01:04:12.319 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:13 np0005591285 podman[253395]: 2026-01-22 01:04:13.239669565 +0000 UTC m=+0.102208875 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 20:04:15 np0005591285 nova_compute[182755]: 2026-01-22 01:04:15.056 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:17 np0005591285 nova_compute[182755]: 2026-01-22 01:04:17.359 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:20 np0005591285 nova_compute[182755]: 2026-01-22 01:04:20.058 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:21 np0005591285 nova_compute[182755]: 2026-01-22 01:04:21.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:22 np0005591285 nova_compute[182755]: 2026-01-22 01:04:22.359 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:04:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:04:25 np0005591285 nova_compute[182755]: 2026-01-22 01:04:25.061 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:27 np0005591285 podman[253422]: 2026-01-22 01:04:27.232265003 +0000 UTC m=+0.093659955 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Jan 21 20:04:27 np0005591285 podman[253423]: 2026-01-22 01:04:27.240971117 +0000 UTC m=+0.092667279 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 20:04:27 np0005591285 nova_compute[182755]: 2026-01-22 01:04:27.361 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:29 np0005591285 nova_compute[182755]: 2026-01-22 01:04:29.231 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:29 np0005591285 nova_compute[182755]: 2026-01-22 01:04:29.232 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 20:04:30 np0005591285 nova_compute[182755]: 2026-01-22 01:04:30.064 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:32 np0005591285 nova_compute[182755]: 2026-01-22 01:04:32.364 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:33 np0005591285 nova_compute[182755]: 2026-01-22 01:04:33.214 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:33 np0005591285 nova_compute[182755]: 2026-01-22 01:04:33.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:34 np0005591285 nova_compute[182755]: 2026-01-22 01:04:34.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:34 np0005591285 nova_compute[182755]: 2026-01-22 01:04:34.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 20:04:34 np0005591285 nova_compute[182755]: 2026-01-22 01:04:34.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 20:04:34 np0005591285 nova_compute[182755]: 2026-01-22 01:04:34.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 20:04:34 np0005591285 nova_compute[182755]: 2026-01-22 01:04:34.237 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:35 np0005591285 nova_compute[182755]: 2026-01-22 01:04:35.067 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:35 np0005591285 podman[253463]: 2026-01-22 01:04:35.210760896 +0000 UTC m=+0.068364076 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 20:04:35 np0005591285 nova_compute[182755]: 2026-01-22 01:04:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:35 np0005591285 nova_compute[182755]: 2026-01-22 01:04:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:37 np0005591285 nova_compute[182755]: 2026-01-22 01:04:37.366 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.239 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.240 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.417 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.418 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5709MB free_disk=73.1083984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.418 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.418 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.483 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.483 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.510 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.523 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.525 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:04:39 np0005591285 nova_compute[182755]: 2026-01-22 01:04:39.525 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:04:40 np0005591285 nova_compute[182755]: 2026-01-22 01:04:40.071 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:42 np0005591285 podman[253490]: 2026-01-22 01:04:42.225270735 +0000 UTC m=+0.079643769 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 20:04:42 np0005591285 podman[253489]: 2026-01-22 01:04:42.236801674 +0000 UTC m=+0.092552966 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 20:04:42 np0005591285 nova_compute[182755]: 2026-01-22 01:04:42.367 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:44 np0005591285 podman[253529]: 2026-01-22 01:04:44.239424549 +0000 UTC m=+0.099952615 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 20:04:45 np0005591285 nova_compute[182755]: 2026-01-22 01:04:45.073 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:45 np0005591285 nova_compute[182755]: 2026-01-22 01:04:45.527 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:04:47 np0005591285 nova_compute[182755]: 2026-01-22 01:04:47.369 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:50 np0005591285 nova_compute[182755]: 2026-01-22 01:04:50.075 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:52 np0005591285 nova_compute[182755]: 2026-01-22 01:04:52.371 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:55 np0005591285 nova_compute[182755]: 2026-01-22 01:04:55.077 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:57 np0005591285 nova_compute[182755]: 2026-01-22 01:04:57.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:04:57 np0005591285 nova_compute[182755]: 2026-01-22 01:04:57.374 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:04:58 np0005591285 podman[253558]: 2026-01-22 01:04:58.20834182 +0000 UTC m=+0.075603981 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 20:04:58 np0005591285 podman[253559]: 2026-01-22 01:04:58.239138367 +0000 UTC m=+0.093106961 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 20:05:00 np0005591285 nova_compute[182755]: 2026-01-22 01:05:00.079 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:02 np0005591285 nova_compute[182755]: 2026-01-22 01:05:02.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:05:03.041 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:05:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:05:03.041 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:05:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:05:03.042 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:05:05 np0005591285 nova_compute[182755]: 2026-01-22 01:05:05.082 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:06 np0005591285 podman[253599]: 2026-01-22 01:05:06.221805362 +0000 UTC m=+0.069444926 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 20:05:07 np0005591285 nova_compute[182755]: 2026-01-22 01:05:07.381 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:10 np0005591285 nova_compute[182755]: 2026-01-22 01:05:10.141 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:12 np0005591285 nova_compute[182755]: 2026-01-22 01:05:12.383 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:13 np0005591285 podman[253626]: 2026-01-22 01:05:13.21100153 +0000 UTC m=+0.071209743 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 21 20:05:13 np0005591285 podman[253627]: 2026-01-22 01:05:13.233381931 +0000 UTC m=+0.079961628 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 20:05:15 np0005591285 nova_compute[182755]: 2026-01-22 01:05:15.167 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:15 np0005591285 podman[253663]: 2026-01-22 01:05:15.275343383 +0000 UTC m=+0.134500943 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 21 20:05:17 np0005591285 nova_compute[182755]: 2026-01-22 01:05:17.385 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:20 np0005591285 nova_compute[182755]: 2026-01-22 01:05:20.215 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:22 np0005591285 nova_compute[182755]: 2026-01-22 01:05:22.386 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:25 np0005591285 nova_compute[182755]: 2026-01-22 01:05:25.280 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:27 np0005591285 nova_compute[182755]: 2026-01-22 01:05:27.390 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:05:29 np0005591285 nova_compute[182755]: 2026-01-22 01:05:29.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:05:29 np0005591285 nova_compute[182755]: 2026-01-22 01:05:29.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:05:29 np0005591285 podman[253691]: 2026-01-22 01:05:29.218470122 +0000 UTC m=+0.081888069 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 20:05:29 np0005591285 podman[253690]: 2026-01-22 01:05:29.233492045 +0000 UTC m=+0.094998122 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 20:05:30 np0005591285 nova_compute[182755]: 2026-01-22 01:05:30.283 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:32 np0005591285 nova_compute[182755]: 2026-01-22 01:05:32.391 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:34 np0005591285 nova_compute[182755]: 2026-01-22 01:05:34.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:34 np0005591285 nova_compute[182755]: 2026-01-22 01:05:34.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 20:05:34 np0005591285 nova_compute[182755]: 2026-01-22 01:05:34.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 20:05:34 np0005591285 nova_compute[182755]: 2026-01-22 01:05:34.234 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 20:05:35 np0005591285 nova_compute[182755]: 2026-01-22 01:05:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:35 np0005591285 nova_compute[182755]: 2026-01-22 01:05:35.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:35 np0005591285 nova_compute[182755]: 2026-01-22 01:05:35.286 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:36 np0005591285 nova_compute[182755]: 2026-01-22 01:05:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:36 np0005591285 nova_compute[182755]: 2026-01-22 01:05:36.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:36 np0005591285 nova_compute[182755]: 2026-01-22 01:05:36.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:37 np0005591285 podman[253730]: 2026-01-22 01:05:37.22806182 +0000 UTC m=+0.084503680 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 20:05:37 np0005591285 nova_compute[182755]: 2026-01-22 01:05:37.394 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.240 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.241 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.323 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.462 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.463 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5715MB free_disk=73.1083984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.463 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.464 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.662 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.663 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.847 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing inventories for resource provider e96a8776-a298-4c19-937a-402cb8191067 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.946 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating ProviderTree inventory for provider e96a8776-a298-4c19-937a-402cb8191067 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.946 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Updating inventory in ProviderTree for provider e96a8776-a298-4c19-937a-402cb8191067 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 20:05:40 np0005591285 nova_compute[182755]: 2026-01-22 01:05:40.981 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing aggregate associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 20:05:41 np0005591285 nova_compute[182755]: 2026-01-22 01:05:41.006 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Refreshing trait associations for resource provider e96a8776-a298-4c19-937a-402cb8191067, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 20:05:41 np0005591285 nova_compute[182755]: 2026-01-22 01:05:41.042 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 20:05:41 np0005591285 nova_compute[182755]: 2026-01-22 01:05:41.078 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 20:05:41 np0005591285 nova_compute[182755]: 2026-01-22 01:05:41.080 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 20:05:41 np0005591285 nova_compute[182755]: 2026-01-22 01:05:41.081 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:05:42 np0005591285 nova_compute[182755]: 2026-01-22 01:05:42.395 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:44 np0005591285 podman[253757]: 2026-01-22 01:05:44.20767138 +0000 UTC m=+0.061698748 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 20:05:44 np0005591285 podman[253756]: 2026-01-22 01:05:44.22893771 +0000 UTC m=+0.082754322 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 21 20:05:45 np0005591285 nova_compute[182755]: 2026-01-22 01:05:45.375 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:46 np0005591285 nova_compute[182755]: 2026-01-22 01:05:46.081 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:05:46 np0005591285 podman[253797]: 2026-01-22 01:05:46.20489035 +0000 UTC m=+0.082679572 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 20:05:47 np0005591285 nova_compute[182755]: 2026-01-22 01:05:47.396 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:50 np0005591285 nova_compute[182755]: 2026-01-22 01:05:50.378 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:52 np0005591285 nova_compute[182755]: 2026-01-22 01:05:52.399 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:55 np0005591285 nova_compute[182755]: 2026-01-22 01:05:55.380 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:05:57 np0005591285 nova_compute[182755]: 2026-01-22 01:05:57.401 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:06:00 np0005591285 podman[253822]: 2026-01-22 01:06:00.17859835 +0000 UTC m=+0.051048572 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 21 20:06:00 np0005591285 podman[253823]: 2026-01-22 01:06:00.211293157 +0000 UTC m=+0.077981184 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 20:06:00 np0005591285 nova_compute[182755]: 2026-01-22 01:06:00.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:02 np0005591285 nova_compute[182755]: 2026-01-22 01:06:02.403 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:06:03.042 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:06:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:06:03.043 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:06:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:06:03.043 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:06:05 np0005591285 nova_compute[182755]: 2026-01-22 01:06:05.466 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:07 np0005591285 nova_compute[182755]: 2026-01-22 01:06:07.403 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:08 np0005591285 podman[253863]: 2026-01-22 01:06:08.229010254 +0000 UTC m=+0.090752508 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 20:06:10 np0005591285 nova_compute[182755]: 2026-01-22 01:06:10.499 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:12 np0005591285 nova_compute[182755]: 2026-01-22 01:06:12.405 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:15 np0005591285 podman[253888]: 2026-01-22 01:06:15.228178809 +0000 UTC m=+0.082429254 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 20:06:15 np0005591285 podman[253887]: 2026-01-22 01:06:15.231821756 +0000 UTC m=+0.086170755 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 21 20:06:15 np0005591285 nova_compute[182755]: 2026-01-22 01:06:15.543 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:17 np0005591285 podman[253931]: 2026-01-22 01:06:17.264555081 +0000 UTC m=+0.137982406 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 21 20:06:17 np0005591285 nova_compute[182755]: 2026-01-22 01:06:17.406 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:20 np0005591285 nova_compute[182755]: 2026-01-22 01:06:20.577 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:22 np0005591285 nova_compute[182755]: 2026-01-22 01:06:22.409 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:06:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:06:25 np0005591285 nova_compute[182755]: 2026-01-22 01:06:25.620 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:27 np0005591285 nova_compute[182755]: 2026-01-22 01:06:27.410 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:30 np0005591285 nova_compute[182755]: 2026-01-22 01:06:30.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:30 np0005591285 nova_compute[182755]: 2026-01-22 01:06:30.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:06:30 np0005591285 nova_compute[182755]: 2026-01-22 01:06:30.673 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:31 np0005591285 podman[253958]: 2026-01-22 01:06:31.183826549 +0000 UTC m=+0.050375244 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 20:06:31 np0005591285 podman[253957]: 2026-01-22 01:06:31.18909829 +0000 UTC m=+0.058201933 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
managed_by=edpm_ansible, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 21 20:06:32 np0005591285 nova_compute[182755]: 2026-01-22 01:06:32.412 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:34 np0005591285 nova_compute[182755]: 2026-01-22 01:06:34.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:34 np0005591285 nova_compute[182755]: 2026-01-22 01:06:34.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 20:06:34 np0005591285 nova_compute[182755]: 2026-01-22 01:06:34.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 20:06:34 np0005591285 nova_compute[182755]: 2026-01-22 01:06:34.240 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 20:06:35 np0005591285 nova_compute[182755]: 2026-01-22 01:06:35.713 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:36 np0005591285 nova_compute[182755]: 2026-01-22 01:06:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:36 np0005591285 nova_compute[182755]: 2026-01-22 01:06:36.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:36 np0005591285 nova_compute[182755]: 2026-01-22 01:06:36.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:37 np0005591285 nova_compute[182755]: 2026-01-22 01:06:37.215 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:37 np0005591285 nova_compute[182755]: 2026-01-22 01:06:37.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:37 np0005591285 nova_compute[182755]: 2026-01-22 01:06:37.414 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:39 np0005591285 podman[253999]: 2026-01-22 01:06:39.215986793 +0000 UTC m=+0.069228780 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 20:06:40 np0005591285 nova_compute[182755]: 2026-01-22 01:06:40.758 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.414 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.415 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5708MB free_disk=73.1083984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.416 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.416 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.417 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.488 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.489 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.507 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.522 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.523 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:06:42 np0005591285 nova_compute[182755]: 2026-01-22 01:06:42.523 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:06:45 np0005591285 nova_compute[182755]: 2026-01-22 01:06:45.523 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:06:45 np0005591285 nova_compute[182755]: 2026-01-22 01:06:45.760 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:46 np0005591285 podman[254024]: 2026-01-22 01:06:46.197379271 +0000 UTC m=+0.062584441 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 20:06:46 np0005591285 podman[254023]: 2026-01-22 01:06:46.213302259 +0000 UTC m=+0.078848809 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 20:06:47 np0005591285 nova_compute[182755]: 2026-01-22 01:06:47.418 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:48 np0005591285 podman[254065]: 2026-01-22 01:06:48.26568215 +0000 UTC m=+0.122476580 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 20:06:50 np0005591285 nova_compute[182755]: 2026-01-22 01:06:50.763 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:52 np0005591285 nova_compute[182755]: 2026-01-22 01:06:52.419 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:55 np0005591285 nova_compute[182755]: 2026-01-22 01:06:55.766 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:57 np0005591285 nova_compute[182755]: 2026-01-22 01:06:57.422 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:06:58 np0005591285 nova_compute[182755]: 2026-01-22 01:06:58.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:00 np0005591285 nova_compute[182755]: 2026-01-22 01:07:00.803 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:02 np0005591285 podman[254092]: 2026-01-22 01:07:02.21081575 +0000 UTC m=+0.076210237 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 21 20:07:02 np0005591285 podman[254093]: 2026-01-22 01:07:02.225590366 +0000 UTC m=+0.075247310 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 21 20:07:02 np0005591285 nova_compute[182755]: 2026-01-22 01:07:02.424 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:07:03.043 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:07:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:07:03.044 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:07:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:07:03.044 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:07:05 np0005591285 nova_compute[182755]: 2026-01-22 01:07:05.844 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:07 np0005591285 nova_compute[182755]: 2026-01-22 01:07:07.425 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:10 np0005591285 podman[254134]: 2026-01-22 01:07:10.192037776 +0000 UTC m=+0.060539606 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 20:07:10 np0005591285 nova_compute[182755]: 2026-01-22 01:07:10.877 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:12 np0005591285 nova_compute[182755]: 2026-01-22 01:07:12.428 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:15 np0005591285 nova_compute[182755]: 2026-01-22 01:07:15.880 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:17 np0005591285 podman[254159]: 2026-01-22 01:07:17.222139913 +0000 UTC m=+0.080780510 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 20:07:17 np0005591285 podman[254160]: 2026-01-22 01:07:17.23022874 +0000 UTC m=+0.087117021 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 20:07:17 np0005591285 nova_compute[182755]: 2026-01-22 01:07:17.431 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:19 np0005591285 podman[254201]: 2026-01-22 01:07:19.321218908 +0000 UTC m=+0.183916030 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 21 20:07:20 np0005591285 nova_compute[182755]: 2026-01-22 01:07:20.925 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:22 np0005591285 nova_compute[182755]: 2026-01-22 01:07:22.432 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:25 np0005591285 nova_compute[182755]: 2026-01-22 01:07:25.928 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:27 np0005591285 nova_compute[182755]: 2026-01-22 01:07:27.433 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:30 np0005591285 nova_compute[182755]: 2026-01-22 01:07:30.932 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:31 np0005591285 nova_compute[182755]: 2026-01-22 01:07:31.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:31 np0005591285 nova_compute[182755]: 2026-01-22 01:07:31.217 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:07:32 np0005591285 nova_compute[182755]: 2026-01-22 01:07:32.437 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:33 np0005591285 podman[254228]: 2026-01-22 01:07:33.223541551 +0000 UTC m=+0.084618274 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 20:07:33 np0005591285 podman[254227]: 2026-01-22 01:07:33.240417264 +0000 UTC m=+0.100989633 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 20:07:35 np0005591285 nova_compute[182755]: 2026-01-22 01:07:35.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:35 np0005591285 nova_compute[182755]: 2026-01-22 01:07:35.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 20:07:35 np0005591285 nova_compute[182755]: 2026-01-22 01:07:35.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 20:07:35 np0005591285 nova_compute[182755]: 2026-01-22 01:07:35.355 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 20:07:35 np0005591285 nova_compute[182755]: 2026-01-22 01:07:35.934 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:36 np0005591285 nova_compute[182755]: 2026-01-22 01:07:36.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:37 np0005591285 nova_compute[182755]: 2026-01-22 01:07:37.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:37 np0005591285 nova_compute[182755]: 2026-01-22 01:07:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:37 np0005591285 nova_compute[182755]: 2026-01-22 01:07:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:37 np0005591285 nova_compute[182755]: 2026-01-22 01:07:37.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:37 np0005591285 nova_compute[182755]: 2026-01-22 01:07:37.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:40 np0005591285 nova_compute[182755]: 2026-01-22 01:07:40.973 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:41 np0005591285 podman[254265]: 2026-01-22 01:07:41.219928385 +0000 UTC m=+0.074220243 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.243 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.244 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.244 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.440 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.503 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.504 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5712MB free_disk=73.1083984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.505 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.505 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.585 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.586 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.632 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.648 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.652 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:07:42 np0005591285 nova_compute[182755]: 2026-01-22 01:07:42.653 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:07:45 np0005591285 nova_compute[182755]: 2026-01-22 01:07:45.975 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:46 np0005591285 nova_compute[182755]: 2026-01-22 01:07:46.654 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:07:47 np0005591285 nova_compute[182755]: 2026-01-22 01:07:47.443 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:48 np0005591285 podman[254289]: 2026-01-22 01:07:48.221272269 +0000 UTC m=+0.080113072 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 20:07:48 np0005591285 podman[254290]: 2026-01-22 01:07:48.222085432 +0000 UTC m=+0.084342386 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 20:07:50 np0005591285 podman[254330]: 2026-01-22 01:07:50.251905477 +0000 UTC m=+0.122678396 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 20:07:51 np0005591285 nova_compute[182755]: 2026-01-22 01:07:51.016 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:52 np0005591285 nova_compute[182755]: 2026-01-22 01:07:52.446 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:56 np0005591285 nova_compute[182755]: 2026-01-22 01:07:56.020 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:07:57 np0005591285 nova_compute[182755]: 2026-01-22 01:07:57.449 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:01 np0005591285 nova_compute[182755]: 2026-01-22 01:08:01.024 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:02 np0005591285 nova_compute[182755]: 2026-01-22 01:08:02.451 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:08:03.045 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:08:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:08:03.047 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:08:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:08:03.047 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:08:04 np0005591285 podman[254357]: 2026-01-22 01:08:04.240195459 +0000 UTC m=+0.101120728 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 21 20:08:04 np0005591285 podman[254358]: 2026-01-22 01:08:04.248174172 +0000 UTC m=+0.103847209 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 20:08:06 np0005591285 nova_compute[182755]: 2026-01-22 01:08:06.027 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:07 np0005591285 nova_compute[182755]: 2026-01-22 01:08:07.453 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:11 np0005591285 nova_compute[182755]: 2026-01-22 01:08:11.063 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:12 np0005591285 podman[254398]: 2026-01-22 01:08:12.248267976 +0000 UTC m=+0.112750149 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 20:08:12 np0005591285 nova_compute[182755]: 2026-01-22 01:08:12.456 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:16 np0005591285 nova_compute[182755]: 2026-01-22 01:08:16.066 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:17 np0005591285 nova_compute[182755]: 2026-01-22 01:08:17.457 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:19 np0005591285 podman[254423]: 2026-01-22 01:08:19.232504261 +0000 UTC m=+0.086324219 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 21 20:08:19 np0005591285 podman[254424]: 2026-01-22 01:08:19.264321716 +0000 UTC m=+0.110753916 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 20:08:21 np0005591285 nova_compute[182755]: 2026-01-22 01:08:21.097 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:21 np0005591285 podman[254464]: 2026-01-22 01:08:21.211238635 +0000 UTC m=+0.086470123 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 20:08:22 np0005591285 nova_compute[182755]: 2026-01-22 01:08:22.458 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.188 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:23 np0005591285 ceilometer_agent_compute[192452]: 2026-01-22 01:08:23.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 20:08:26 np0005591285 nova_compute[182755]: 2026-01-22 01:08:26.099 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:27 np0005591285 nova_compute[182755]: 2026-01-22 01:08:27.460 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:31 np0005591285 nova_compute[182755]: 2026-01-22 01:08:31.102 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:31 np0005591285 nova_compute[182755]: 2026-01-22 01:08:31.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:31 np0005591285 nova_compute[182755]: 2026-01-22 01:08:31.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:08:32 np0005591285 nova_compute[182755]: 2026-01-22 01:08:32.461 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:35 np0005591285 nova_compute[182755]: 2026-01-22 01:08:35.219 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:35 np0005591285 nova_compute[182755]: 2026-01-22 01:08:35.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 21 20:08:35 np0005591285 nova_compute[182755]: 2026-01-22 01:08:35.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 21 20:08:35 np0005591285 podman[254491]: 2026-01-22 01:08:35.228216787 +0000 UTC m=+0.083862613 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 21 20:08:35 np0005591285 podman[254492]: 2026-01-22 01:08:35.239441708 +0000 UTC m=+0.089470713 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 20:08:35 np0005591285 nova_compute[182755]: 2026-01-22 01:08:35.246 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 21 20:08:36 np0005591285 nova_compute[182755]: 2026-01-22 01:08:36.106 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:37 np0005591285 nova_compute[182755]: 2026-01-22 01:08:37.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:37 np0005591285 nova_compute[182755]: 2026-01-22 01:08:37.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:37 np0005591285 nova_compute[182755]: 2026-01-22 01:08:37.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:37 np0005591285 nova_compute[182755]: 2026-01-22 01:08:37.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:38 np0005591285 nova_compute[182755]: 2026-01-22 01:08:38.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:39 np0005591285 nova_compute[182755]: 2026-01-22 01:08:39.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:41 np0005591285 nova_compute[182755]: 2026-01-22 01:08:41.109 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:42 np0005591285 nova_compute[182755]: 2026-01-22 01:08:42.463 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:43 np0005591285 podman[254531]: 2026-01-22 01:08:43.216088792 +0000 UTC m=+0.090885932 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.241 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.242 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.243 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.429 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.431 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5714MB free_disk=73.1083984375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.431 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.432 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.514 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.515 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.540 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.565 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.567 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 21 20:08:43 np0005591285 nova_compute[182755]: 2026-01-22 01:08:43.568 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:08:46 np0005591285 nova_compute[182755]: 2026-01-22 01:08:46.112 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:47 np0005591285 nova_compute[182755]: 2026-01-22 01:08:47.465 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:47 np0005591285 nova_compute[182755]: 2026-01-22 01:08:47.569 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:08:50 np0005591285 podman[254558]: 2026-01-22 01:08:50.206678607 +0000 UTC m=+0.065474629 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 20:08:50 np0005591285 podman[254557]: 2026-01-22 01:08:50.211535967 +0000 UTC m=+0.072296892 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 20:08:51 np0005591285 nova_compute[182755]: 2026-01-22 01:08:51.116 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:53 np0005591285 nova_compute[182755]: 2026-01-22 01:08:52.467 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:53 np0005591285 podman[254598]: 2026-01-22 01:08:53.164571863 +0000 UTC m=+0.126928199 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 20:08:56 np0005591285 nova_compute[182755]: 2026-01-22 01:08:56.118 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:08:57 np0005591285 nova_compute[182755]: 2026-01-22 01:08:57.468 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:01 np0005591285 nova_compute[182755]: 2026-01-22 01:09:01.121 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:01 np0005591285 nova_compute[182755]: 2026-01-22 01:09:01.213 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:01 np0005591285 nova_compute[182755]: 2026-01-22 01:09:01.233 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:01 np0005591285 nova_compute[182755]: 2026-01-22 01:09:01.234 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 21 20:09:02 np0005591285 nova_compute[182755]: 2026-01-22 01:09:02.247 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:02 np0005591285 nova_compute[182755]: 2026-01-22 01:09:02.247 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 21 20:09:02 np0005591285 nova_compute[182755]: 2026-01-22 01:09:02.266 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 21 20:09:02 np0005591285 nova_compute[182755]: 2026-01-22 01:09:02.470 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:09:03.046 104259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 21 20:09:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:09:03.047 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 21 20:09:03 np0005591285 ovn_metadata_agent[104254]: 2026-01-22 01:09:03.047 104259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 21 20:09:06 np0005591285 nova_compute[182755]: 2026-01-22 01:09:06.123 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:06 np0005591285 podman[254627]: 2026-01-22 01:09:06.246405325 +0000 UTC m=+0.095592158 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 20:09:06 np0005591285 podman[254626]: 2026-01-22 01:09:06.267317846 +0000 UTC m=+0.127986938 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Jan 21 20:09:07 np0005591285 nova_compute[182755]: 2026-01-22 01:09:07.506 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:11 np0005591285 nova_compute[182755]: 2026-01-22 01:09:11.127 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:12 np0005591285 nova_compute[182755]: 2026-01-22 01:09:12.510 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:14 np0005591285 podman[254665]: 2026-01-22 01:09:14.22053468 +0000 UTC m=+0.085028683 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 20:09:14 np0005591285 nova_compute[182755]: 2026-01-22 01:09:14.696 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:16 np0005591285 nova_compute[182755]: 2026-01-22 01:09:16.129 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:17 np0005591285 nova_compute[182755]: 2026-01-22 01:09:17.511 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:21 np0005591285 nova_compute[182755]: 2026-01-22 01:09:21.131 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:21 np0005591285 podman[254693]: 2026-01-22 01:09:21.218469872 +0000 UTC m=+0.079638789 container health_status 8919309155a033da8deb024bb37b9fc0d285fe97686ee0d202af37a5f2df8416 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 20:09:21 np0005591285 podman[254692]: 2026-01-22 01:09:21.230753253 +0000 UTC m=+0.087643876 container health_status 482a8450540b33e5cd4c4cb0c4b60281016c657b79b89fa82f8a9e447843f995 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 20:09:22 np0005591285 nova_compute[182755]: 2026-01-22 01:09:22.512 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:24 np0005591285 podman[254733]: 2026-01-22 01:09:24.249224796 +0000 UTC m=+0.117753453 container health_status e38def30a2d032278c507a8fe96e62e9ea51ff4721fe55b24e66691821fd079e (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 20:09:26 np0005591285 nova_compute[182755]: 2026-01-22 01:09:26.134 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:26 np0005591285 nova_compute[182755]: 2026-01-22 01:09:26.732 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:27 np0005591285 nova_compute[182755]: 2026-01-22 01:09:27.514 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:30 np0005591285 nova_compute[182755]: 2026-01-22 01:09:30.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:31 np0005591285 nova_compute[182755]: 2026-01-22 01:09:31.136 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:32 np0005591285 nova_compute[182755]: 2026-01-22 01:09:32.236 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 21 20:09:32 np0005591285 nova_compute[182755]: 2026-01-22 01:09:32.236 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 21 20:09:32 np0005591285 nova_compute[182755]: 2026-01-22 01:09:32.516 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:34 np0005591285 systemd-logind[788]: New session 57 of user zuul.
Jan 21 20:09:34 np0005591285 systemd[1]: Started Session 57 of User zuul.
Jan 21 20:09:36 np0005591285 nova_compute[182755]: 2026-01-22 01:09:36.138 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 21 20:09:36 np0005591285 podman[254896]: 2026-01-22 01:09:36.70968121 +0000 UTC m=+0.068913802 container health_status 769668cc4e818cfae854ab3fcb5517227ddca1e18005a18ef4d782a494f45662 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Jan 21 20:09:36 np0005591285 podman[254901]: 2026-01-22 01:09:36.714435338 +0000 UTC m=+0.072415126 container health_status b5c1a31a508d0cf9e10702abe9c0ef424614b4d8f0daee85791f7de054afa6f0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 21 20:09:37 np0005591285 nova_compute[182755]: 2026-01-22 01:09:37.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:37 np0005591285 nova_compute[182755]: 2026-01-22 01:09:37.218 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 20:09:37 np0005591285 nova_compute[182755]: 2026-01-22 01:09:37.219 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 20:09:37 np0005591285 nova_compute[182755]: 2026-01-22 01:09:37.237 182759 DEBUG nova.compute.manager [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 20:09:37 np0005591285 nova_compute[182755]: 2026-01-22 01:09:37.238 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:37 np0005591285 nova_compute[182755]: 2026-01-22 01:09:37.519 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:09:38 np0005591285 nova_compute[182755]: 2026-01-22 01:09:38.233 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:39 np0005591285 nova_compute[182755]: 2026-01-22 01:09:39.216 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:39 np0005591285 nova_compute[182755]: 2026-01-22 01:09:39.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:39 np0005591285 ovs-vsctl[254976]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 21 20:09:40 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 21 20:09:40 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 21 20:09:40 np0005591285 virtqemud[182299]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 21 20:09:41 np0005591285 nova_compute[182755]: 2026-01-22 01:09:41.142 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:09:41 np0005591285 nova_compute[182755]: 2026-01-22 01:09:41.218 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:42 np0005591285 nova_compute[182755]: 2026-01-22 01:09:42.520 182759 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 20:09:43 np0005591285 systemd[1]: Starting Hostname Service...
Jan 21 20:09:43 np0005591285 systemd[1]: Started Hostname Service.
Jan 21 20:09:45 np0005591285 podman[255613]: 2026-01-22 01:09:45.180995977 +0000 UTC m=+0.055334058 container health_status 3d927e208c8d9d2bf2ed958430b573b5beb6e6062ba87e1ec7a0ee42c855b2e4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '3579c83a3c0f2a6c1d6d9063c43f29888e1e1166032f7823fed24b1414b1e7c8-0d2d656fe422a23de079929c91d0475fa21541d506229340bf1fed35821d03fc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.217 182759 DEBUG oslo_service.periodic_task [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.252 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.253 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.253 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.403 182759 WARNING nova.virt.libvirt.driver [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.404 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5310MB free_disk=72.92241668701172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.404 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.404 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.523 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.523 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.544 182759 DEBUG nova.compute.provider_tree [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed in ProviderTree for provider: e96a8776-a298-4c19-937a-402cb8191067 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.566 182759 DEBUG nova.scheduler.client.report [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Inventory has not changed for provider e96a8776-a298-4c19-937a-402cb8191067 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.586 182759 DEBUG nova.compute.resource_tracker [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 20:09:45 np0005591285 nova_compute[182755]: 2026-01-22 01:09:45.586 182759 DEBUG oslo_concurrency.lockutils [None req-7793717a-f1a0-4154-9b4a-765e5c236ac8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
